Stanford Universal Dependencies on Python NLTK
Question: Is there any way I can get the Universal Dependencies using Python or NLTK? I can only produce the parse tree.

Example:

Input sentence: My dog also likes eating sausage.

Output (Universal Dependencies):

    nmod:poss(dog-2, My-1)
    nsubj(likes-4, dog-2)
    advmod(likes-4, also-3)
    root(ROOT-0, likes-4)
    xcomp(likes-4, eating-5)
    dobj(eating-5, sausage-6)

Answer 1: Wordseer's stanford-corenlp-python fork is a good start, as it works with the recent CoreNLP release (3.5.2). However it will give you raw output,
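For context, here is a minimal sketch of one way to get dependency triples from Python today, using NLTK's built-in CoreNLP client rather than the stanford-corenlp-python fork discussed above. It assumes a recent NLTK (3.3+) and a CoreNLP server that you have started separately on port 9000; the URL and port are illustrative.

    # Assumes a CoreNLP server is already running, e.g.:
    #   java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000
    from nltk.parse.corenlp import CoreNLPDependencyParser

    # Point the client at the locally running server (URL is an assumption)
    parser = CoreNLPDependencyParser(url='http://localhost:9000')

    # raw_parse() returns an iterator of DependencyGraph objects, one per parse
    parse, = parser.raw_parse('My dog also likes eating sausage.')

    # triples() yields ((governor, gov_tag), relation, (dependent, dep_tag))
    for governor, relation, dependent in parse.triples():
        print(governor, relation, dependent)

This prints head/relation/dependent triples such as (('likes', 'VBZ'), 'nsubj', ('dog', 'NN')); if you want the CoNLL-style output instead, DependencyGraph also offers to_conll().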