• James C, Pappalardo L, Sîrbu A, Simini F. Prediction of next career moves from scientific profiles. arXiv preprint. 2018. arXiv:1802.04830.

  • Yang Y, Zhan D-C, Jiang Y. Which one will be next? An analysis of talent demission. 2018.

  • Zhao Y, Hryniewicki MK, Cheng F, Fu B, Zhu X. Employee turnover prediction with machine learning: a reliable approach. In: Proceedings of SAI intelligent systems conference. Springer; 2018. p. 737–58.

  • Liu Y, Zhang L, Nie L, Yan Y, Rosenblum DS. Fortune teller: predicting your career path. In: Thirtieth AAAI conference on artificial intelligence. 2016.

  • Mimno D, McCallum A. Modeling career path trajectories. Citeseer; 2008.

  • Chen Z. Mining individual behavior pattern based on significant locations and spatial trajectories. In: 2012 IEEE international conference on pervasive computing and communications workshops. IEEE; 2012. p. 540–1.

  • Cetintas S, Rogati M, Si L, Fang Y. Identifying similar people in professional social networks with discriminative probabilistic models. In: Proceedings of the 34th international ACM SIGIR conference on research and development in information retrieval. 2011. p. 1209–10.

  • Peters ME, Neumann M, Iyyer M, Gardner M, Clark C, Lee K, Zettlemoyer L. Deep contextualized word representations. arXiv preprint. 2018. arXiv:1802.05365.

  • Liu L, Shang J, Ren X, Xu FF, Gui H, Peng J, Han J. Empower sequence labeling with task-aware neural language model. In: Thirty-second AAAI conference on artificial intelligence; 2018.

  • Liu J, Ng YC, Wood KL, Lim KH. IPOD: a large-scale industrial and professional occupation dataset. In: Conference companion publication of the 2020 on computer supported cooperative work and social computing. 2020. p. 323–8.

  • Lou Y, Ren R, Zhao Y. A machine learning approach for future career planning. Citeseer, Technical report; 2010.

  • Paparrizos I, Cambazoglu BB, Gionis A. Machine learned job recommendation. In: Proceedings of the fifth ACM conference on recommender systems. ACM; 2011. p. 325–8.

  • Zhang Y, Yang C, Niu Z. A research of job recommendation system based on collaborative filtering. In: 2014 seventh international symposium on computational intelligence and design, vol. 1. IEEE; 2014. p. 533–8.

  • Li L, Jing H, Tong H, Yang J, He Q, Chen B-C. Nemo: next career move prediction with contextual embedding. In: Proceedings of the 26th international conference on world wide web companion. International World Wide Web Conferences Steering Committee; 2017. p. 505–13.

  • Li H, Ge Y, Zhu H, Xiong H, Zhao H. Prospecting the career development of talents: a survival analysis perspective. In: Proceedings of the 23rd ACM SIGKDD international conference on knowledge discovery and data mining. ACM; 2017. p. 917–25.

  • Yang S, Korayem M, AlJadda K, Grainger T, Natarajan S. Combining content-based and collaborative filtering for job recommendation system: a cost-sensitive statistical relational learning approach. Knowl Based Syst. 2017;136:37–45.

  • Zhu C, Zhu H, Xiong H, Ma C, Xie F, Ding P, Li P. Person-job fit: adapting the right talent for the right job with joint representation learning. ACM Trans Manag Inf Syst (TMIS). 2018;9(3):12.


  • Xu H, Yu Z, Guo B, Teng M, Xiong H. Extracting job title hierarchy from career trajectories: a Bayesian perspective. In: IJCAI. 2018. p. 3599–605.

  • Qin C, Zhu H, Xu T, Zhu C, Jiang L, Chen E, Xiong H. Enhancing person-job fit for talent recruitment: an ability-aware neural network approach. In: The 41st international ACM SIGIR conference on research & development in information retrieval. ACM; 2018. p. 25–34.

  • Lim E-P, Lee W-C, Tian Y, Hung C-C. Are you on the right track? Learning career tracks for job movement analysis. In: Workshop on data science for human capital management (DSHCM2018). DSHCM; 2018. p. 1–16.

  • Shen D, Zhu H, Zhu C, Xu T, Ma C, Xiong H. A joint learning approach to intelligent job interview assessment. In: IJCAI. 2018. p. 3542–8.

  • Zhang L, Zhu H, Xu T, Zhu C, Qin C, Xiong H, Chen E. Large-scale talent flow forecast with dynamic latent factor model. In: The world wide web conference. 2019. p. 2312–22.

  • Nigam A, Roy A, Singh H, Waila H. Job recommendation through progression of job selection. In: 2019 IEEE 6th international conference on cloud computing and intelligence systems (CCIS). IEEE; 2019. p. 212–6.

  • Meng Q, Zhu H, Xiao K, Zhang L, Xiong H. A hierarchical career-path-aware neural network for job mobility prediction. In: Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery & data mining. 2019. p. 14–24.

  • Van Huynh T, Van Nguyen K, Nguyen NL-T, Nguyen AG-T. Job prediction: from deep neural network models to applications. In: 2020 RIVF international conference on computing and communication technologies (RIVF). IEEE; 2020. p. 1–6.

  • Gugnani A, Misra H. Implicit skills extraction using document embedding and its use in job recommendation. In: Proceedings of the AAAI conference on artificial intelligence, vol. 34. 2020. p. 13286–93.

  • Alanoca HA, Vidal AA, Saire JEC. Curriculum vitae recommendation based on text mining. arXiv preprint. 2020. arXiv:2007.11053.

  • Zhang L, Zhou D, Zhu H, Xu T, Zha R, Chen E, Xiong H. Attentive heterogeneous graph embedding for job mobility prediction. In: Proceedings of the 27th ACM SIGKDD conference on knowledge discovery & data mining. 2021. p. 2192–201.

  • Finkel JR, Grenager T, Manning C. Incorporating non-local information into information extraction systems by Gibbs sampling. In: Proceedings of the 43rd annual meeting on association for computational linguistics. Association for Computational Linguistics; 2005. p. 363–70.

  • Tjong Kim Sang EF, De Meulder F. Introduction to the CoNLL-2003 shared task: language-independent named entity recognition. 2003. arXiv:cs/0306050.

  • Weischedel R, Palmer M, Marcus M, Hovy E, Pradhan S, Ramshaw L, Xue N, Taylor A, Kaufman J, Franchini M, et al. OntoNotes Release 5.0 LDC2013T19. Philadelphia: Linguistic Data Consortium; 2013. p. 23.

  • Borchmann Ł, Gretkowski A, Graliński F. Approaching nested named entity recognition with parallel LSTM-CRFs. In: Proceedings of the PolEval 2018 workshop. 2018. p. 63.

  • Viera AJ, Garrett JM, et al. Understanding interobserver agreement: the kappa statistic. Fam Med. 2005;37(5):360–3.


  • Artstein R, Poesio M. Inter-coder agreement for computational linguistics. Comput Linguist. 2008;34(4):555–96.

  • Ratinov L, Roth D. Design challenges and misconceptions in named entity recognition. In: Proceedings of the thirteenth conference on computational natural language learning. CoNLL ’09. Stroudsburg: Association for Computational Linguistics. 2009. p. 147–55. http://dl.acm.org/citation.cfm?id=1596374.1596399.

  • Massoni S, Olteanu M, Rousset P. Career-path analysis using optimal matching and self-organizing maps. In: International workshop on self-organizing maps. Springer; 2009. p. 154–62.

  • Malinowski J, Keim T, Wendt O, Weitzel T. Matching people and jobs: a bilateral recommendation approach. In: Proceedings of the 39th annual Hawaii international conference on system sciences (HICSS’06), vol. 6. IEEE; 2006. p. 137.

  • Bengio Y, Ducharme R, Vincent P, Jauvin C. A neural probabilistic language model. J Mach Learn Res. 2003;3(Feb):1137–55.

  • Turian J, Ratinov L, Bengio Y. Word representations: a simple and general method for semi-supervised learning. In: Proceedings of the 48th annual meeting of the association for computational linguistics. Association for Computational Linguistics; 2010. p. 384–94.

  • Mikolov T, Chen K, Corrado G, Dean J. Efficient estimation of word representations in vector space. arXiv preprint. 2013. arXiv:1301.3781.

  • Pennington J, Socher R, Manning C. GloVe: global vectors for word representation. In: Proceedings of the 2014 conference on empirical methods in natural language processing (EMNLP). 2014. p. 1532–43.

  • Bojanowski P, Grave E, Joulin A, Mikolov T. Enriching word vectors with subword information. Trans Assoc Comput Linguist. 2017;5:135–46.

  • Akbik A, Blythe D, Vollgraf R. Contextual string embeddings for sequence labeling. In: Proceedings of the 27th international conference on computational linguistics. 2018. p. 1638–49.

  • Liu J, Singhal T, Blessing LTM, Wood KL, Lim KH. CrisisBERT: a robust transformer for crisis classification and contextual crisis embedding. In: Proceedings of the 32nd ACM conference on hypertext and social media (HT’21). 2021. p. 133–41.

  • Singhal T, Liu J, Blessing LT, Lim KH. Analyzing scientific publications using domain-specific word embedding and topic modelling. In: 2021 IEEE international conference on big data (Big Data). IEEE; 2021. p. 4965–73.

  • Kumar S, Zymbler M. A machine learning approach to analyze customer satisfaction from airline tweets. J Big Data. 2019;6(1):1–16.

  • Li M, Lim KH. Geotagging social media posts to landmarks using hierarchical BERT (student abstract). In: Proceedings of the thirty-sixth AAAI conference on artificial intelligence (AAAI’22). 2022.

  • Solanki P, Harwood A, et al. User identification across social networking sites using user profiles and posting patterns. In: 2021 international joint conference on neural networks (IJCNN). IEEE; 2021. p. 1–8.

  • Pek YN, Lim KH. Identifying and understanding business trends using topic models with word embedding. In: Proceedings of the 2019 IEEE international conference on big data (BigData’19). 2019. p. 6177–9.

  • Ho NL, Lim KH. User preferential tour recommendation based on POI-embedding methods. In: Proceedings of the 26th international conference on intelligent user interfaces companion (IUI’21). 2021. p. 46–8.

  • Mu W, Lim KH, Liu J, Karunasekera S, Falzon L, Harwood A. A clustering-based topic model using word networks and word embeddings. J Big Data. 2022;9(1):1–38.

  • Devlin J, Chang M-W, Lee K, Toutanova K. BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint. 2018. arXiv:1810.04805.

  • Radford A, Wu J, Child R, Luan D, Amodei D, Sutskever I. Language models are unsupervised multitask learners. OpenAI Blog. 2019;1(8):9.


  • Lample G, Ballesteros M, Subramanian S, Kawakami K, Dyer C. Neural architectures for named entity recognition. arXiv preprint. 2016. arXiv:1603.01360.

  • Reimers N, Gurevych I. Sentence-BERT: sentence embeddings using siamese BERT-networks. In: Proceedings of the 2019 conference on empirical methods in natural language processing. Association for Computational Linguistics; 2019.

  • Zhang Y, He R, Liu Z, Lim KH, Bing L. An unsupervised sentence embedding method by mutual information maximization. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP’20). 2020. p. 1601–10.

  • Forney GD. The Viterbi algorithm. Proc IEEE. 1973;61(3):268–78.

  • Kazama J, Torisawa K. Inducing gazetteers for named entity recognition by large-scale clustering of dependency relations. In: Proceedings of ACL-08: HLT. 2008. p. 407–15.

  • Saha SK, Sarkar S, Mitra P. Gazetteer preparation for named entity recognition in Indian languages. In: Proceedings of the 6th workshop on asian language resources. 2008.

  • Nallapati R, Surdeanu M, Manning C. Blind domain transfer for named entity recognition using generative latent topic models. In: Proceedings of the NIPS 2010 workshop on transfer learning via rich generative models. 2010. p. 281–9.

  • Mukund S, Srihari RK. NE tagging for Urdu based on bootstrap POS learning. In: Proceedings of the third international workshop on cross lingual information access: addressing the information need of multilingual societies. Association for Computational Linguistics; 2009. p. 61–9.

  • Ramshaw LA, Marcus MP. Text chunking using transformation-based learning. In: Natural language processing using very large corpora. Springer; 1999. p. 157–76.

  • Akhtar A. Singapore and Hong Kong have overtaken the US as the most competitive economies. Here’s how 25 countries rank. Business Insider. 2019. https://www.businessinsider.com/most-competitive-economies-in-the-world-2019-5.

  • Lafferty J, McCallum A, Pereira FC. Conditional random fields: probabilistic models for segmenting and labeling sequence data. In: Proceedings of the eighteenth international conference on machine learning (ICML). 2001.

  • Kingma DP, Salimans T, Welling M. Variational dropout and the local reparameterization trick. In: Advances in neural information processing systems. 2015. p. 2575–83.

  • Krallinger M, Rabal O, Leitner F, Vazquez M, Salgado D, Lu Z, Leaman R, Lu Y, Ji D, Lowe DM. The CHEMDNER corpus of chemicals and drugs and its annotation principles. J Cheminform. 2015;7(S1):2.

  • Rajpurkar P, Jia R, Liang P. Know what you don’t know: unanswerable questions for SQuAD. In: Proceedings of the 56th annual meeting of the association for computational linguistics, vol. 2 (short papers). 2018. https://doi.org/10.18653/v1/p18-2124.

  • Rajpurkar P, Zhang J, Lopyrev K, Liang P. SQuAD: 100,000+ questions for machine comprehension of text. In: Proceedings of the 2016 conference on empirical methods in natural language processing. 2016. https://doi.org/10.18653/v1/d16-1264.

  • Rights and permissions

    Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
