semantic-analysis

Does the Stanford NLP Parser have methods for semantic role labelling?

纵然是瞬间 submitted on 2019-12-10 10:13:08

Question: I'm trying to find the semantic labels of English sentences. I am using the Stanford NLP parser. Does it have methods for this? I was going through the documentation, but the closest things I could find were CoreAnnotations.SemanticWordAnnotation and CoreAnnotations.SemanticTagAnnotation.

Answer 1: No, we currently don't have a semantic role labeling (SRL) system in CoreNLP. Unless you already have a system that explicitly requires semantic role labels, I would recommend taking a look at the Universal Dependencies representation. Despite the fact that this representation is primarily a syntactic representation, …
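Since the answer points away from SRL and toward Universal Dependencies, here is a minimal sketch of getting a UD parse from the Stanford NLP group's tooling via its Stanza Python library; using Stanza rather than the Java CoreNLP API is my own substitution for illustration:

    # Minimal sketch: Universal Dependencies parse with Stanza.
    # Assumes: pip install stanza
    import stanza

    stanza.download("en")  # one-time download of the English models
    nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma,depparse")

    doc = nlp("The robber handed the money to the teller.")
    for sent in doc.sentences:
        for word in sent.words:
            head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
            # word.deprel holds the Universal Dependencies relation label
            print(f"{word.text} --{word.deprel}--> {head}")

Relations such as nsubj and obj carry much of the who-did-what-to-whom information that an SRL system would otherwise provide.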

Using WordNet to determine semantic similarity between two texts?

孤者浪人 submitted on 2019-12-09 13:36:53

Question: How can you determine the semantic similarity between two texts in Python using WordNet? The obvious preprocessing would be removing stop words and stemming, but then what? The only way I can think of would be to calculate the WordNet path distance between each word in the two texts. This is standard for unigrams, but these are large (400-word) texts: natural language documents, with words that are not in any particular order or structure (other than those imposed by English grammar). So, which words would you compare between texts, and how would you do this in Python? One thing that you …
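The excerpt is cut off before any answer, but a common baseline is to align each word with its best-matching word in the other text and average the resulting WordNet similarities. A rough sketch with NLTK; the greedy best-match scheme is my assumption, not taken from the original thread:

    # Sketch: text-to-text similarity from best-match WordNet path similarity.
    # Assumes: pip install nltk, then nltk.download("wordnet").
    from nltk.corpus import wordnet as wn

    def word_sim(w1, w2):
        """Best path similarity over all synset pairs of two words."""
        best = 0.0
        for s1 in wn.synsets(w1):
            for s2 in wn.synsets(w2):
                sim = s1.path_similarity(s2)
                if sim is not None and sim > best:
                    best = sim
        return best

    def directed_sim(words1, words2):
        """Average, over words1, of each word's best match in words2."""
        if not words1 or not words2:
            return 0.0
        return sum(max(word_sim(w, v) for v in words2) for w in words1) / len(words1)

    def text_sim(words1, words2):
        """Symmetric version of the directed score."""
        return (directed_sim(words1, words2) + directed_sim(words2, words1)) / 2

    print(text_sim(["dog", "barked", "loudly"], ["hound", "howled"]))

For 400-word documents this costs one synset comparison per word pair, so in practice you would restrict it to content words after the stop-word removal the question already mentions.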

How to identify adjectives or adverbs?

蓝咒 submitted on 2019-12-08 09:17:50

Question: I am quite a novice at NLP. Is there any API or a way in which I could identify verbs, adjectives, or adverbs in a sentence? I need it for a project.

Answer 1: You will need a part-of-speech tagger (POS tagger). This identifies the role of every word in the sentence. Wikipedia has an excellent list of NLP toolkits, and almost all of them will have POS taggers. If your material is normal written English, the POS taggers will do well. If it's very colloquial (e.g. text messages) or very unusual (e.g. …
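To make the answer concrete, here is a minimal sketch with NLTK's tagger; NLTK is my choice of toolkit, since the answer deliberately names none:

    # Sketch: picking out verbs, adjectives, and adverbs with NLTK's POS tagger.
    # Assumes: pip install nltk, plus nltk.download("punkt") and
    # nltk.download("averaged_perceptron_tagger").
    import nltk

    sentence = "The extremely quick fox quietly jumped over the lazy dog."
    tagged = nltk.pos_tag(nltk.word_tokenize(sentence))

    # Penn Treebank tags: VB* = verbs, JJ* = adjectives, RB* = adverbs
    for word, tag in tagged:
        if tag.startswith(("VB", "JJ", "RB")):
            print(word, tag)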

semantic type checking analysis in bison

岁酱吖の submitted on 2019-12-06 07:18:14

Question: I've been trying to find examples everywhere, but in vain. I am trying to write a basic Ruby interpreter. For this, I wrote a flex lexical file containing the token-recognition rules, and a grammar file. I want my grammar to perform semantic type checking. My grammar file contains, for example:

    arg : arg '+' arg

This should be a valid rule for both integers and floats. According to what I've read, I can specify a type for a non-terminal such as arg, like so:

    %type <intval> arg

where "intval" is in the type union and corresponds to the C int type. But this is only for integers; I am not …
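The excerpt ends before any answer. One standard way around bison's one-C-type-per-symbol restriction is to make the semantic value itself carry a type tag and check or promote that tag inside the action. To stay consistent with the Python examples elsewhere on this page, here is that idea sketched with PLY, a yacc-like Python library; the int-to-float promotion rule is my illustrative assumption:

    # Sketch: semantic type checking in a yacc-style grammar using PLY.
    # Every semantic value is a (type, value) pair; the '+' action
    # inspects the tags instead of relying on %union. Assumes: pip install ply
    import ply.lex as lex
    import ply.yacc as yacc

    tokens = ("INT", "FLOAT", "PLUS")
    t_PLUS = r"\+"
    t_ignore = " \t"

    def t_FLOAT(t):
        r"\d+\.\d+"
        t.value = ("float", float(t.value))
        return t

    def t_INT(t):
        r"\d+"
        t.value = ("int", int(t.value))
        return t

    def t_error(t):
        raise SyntaxError(f"illegal character {t.value[0]!r}")

    precedence = (("left", "PLUS"),)

    def p_arg_plus(p):
        "arg : arg PLUS arg"
        (ltype, lval), (rtype, rval) = p[1], p[3]
        # the semantic check: promote to float if either operand is a float
        result_type = "float" if "float" in (ltype, rtype) else "int"
        p[0] = (result_type, lval + rval)

    def p_arg_literal(p):
        """arg : INT
               | FLOAT"""
        p[0] = p[1]

    def p_error(p):
        raise SyntaxError("parse error")

    lex.lex()
    parser = yacc.yacc()
    print(parser.parse("1 + 2 + 3.5"))  # ('float', 6.5)

In bison itself the equivalent trick is a %union member that is a struct holding both a type tag and the value, with the check written in C inside each action.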

What is the use of Brown Corpus in measuring Semantic Similarity based on WordNet

耗尽温柔 submitted on 2019-12-05 17:37:45

Question: I came across several methods for measuring semantic similarity that use the structure and hierarchy of WordNet, e.g. the Jiang and Conrath measure (JCN), the Resnik measure (RES), the Lin measure (LIN), etc. The way they are computed using NLTK is:

    sim2 = wn.jcn_similarity(entry1, entry2, brown_ic)
    sim3 = entry1.res_similarity(entry2, brown_ic)
    sim4 = entry1.lin_similarity(entry2, brown_ic)

If WordNet is the basis for calculating semantic similarity, what is the use of the Brown Corpus here?

Answer (arturomp): Take a look at the explanation at the NLTK howto for WordNet. Specifically, the *_ic notation is information content. …
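In short, the Brown Corpus supplies word frequencies from which NLTK estimates the information content (IC) of each WordNet synset, and the JCN, RES, and LIN measures all consume that IC table alongside the WordNet hierarchy. A minimal sketch:

    # Sketch: structure-only vs. IC-based WordNet similarity in NLTK.
    # Assumes: nltk.download("wordnet") and nltk.download("wordnet_ic").
    from nltk.corpus import wordnet as wn
    from nltk.corpus import wordnet_ic

    brown_ic = wordnet_ic.ic("ic-brown.dat")  # IC table built from Brown counts

    dog = wn.synset("dog.n.01")
    cat = wn.synset("cat.n.01")

    # Uses only the WordNet hierarchy, no corpus:
    print("path:", dog.path_similarity(cat))
    # Use the hierarchy plus Brown-derived synset probabilities:
    print("res: ", dog.res_similarity(cat, brown_ic))
    print("jcn: ", dog.jcn_similarity(cat, brown_ic))
    print("lin: ", dog.lin_similarity(cat, brown_ic))

Swapping in a different IC file (e.g. one built from SemCor) changes the probability estimates and therefore the scores, which is exactly the corpus's role.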

RDF and OWL workflow question

痞子三分冷 submitted on 2019-12-04 15:32:03

Question: I have been looking at and playing with OWL through Protege, and I would like to know if I understand the "workflow" and idea of it correctly (for building up a database from scratch):

1. Generate an OWL ontology for your data using Protege or equivalent
2. Export this schema to RDF
3. Use the classes defined as some of the elements in a triplestore, along with your target data
4. Export your triplestore to RDF
5. Use openRDF/Sesame or Jena to load the defined data and ontology
6. Validate your RDF triplestore against your OWL ontology to make sure everything is OK
7. Use SPARQL to get data from your RDF triplestore …
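The excerpt ends before any answer, but the load-and-query steps (5 and 7) can be tried directly from Python with rdflib; rdflib is my stand-in for the Sesame/Jena Java stacks the question names, and the file names are hypothetical placeholders:

    # Sketch: load an OWL ontology plus instance data, then query with SPARQL.
    # Assumes: pip install rdflib; "ontology.owl" and "data.ttl" are placeholders.
    from rdflib import Graph

    g = Graph()
    g.parse("ontology.owl", format="xml")    # the OWL schema, as RDF/XML
    g.parse("data.ttl", format="turtle")     # the instance data

    query = """
        SELECT ?s ?type WHERE {
            ?s a ?type .
        } LIMIT 10
    """
    for row in g.query(query):
        print(row.s, row.type)

Note that rdflib alone does not perform the OWL validation in step 6; that needs a reasoner on top, such as the owlrl package in Python or Pellet/HermiT on the Java side.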
