solr

Relevancy boosting very slow in Solr

最后都变了 - Submitted on 2020-01-05 09:34:06
Question: I have a Solr index with about 2.5M items in it and I am trying to use an ExternalFileField to boost relevancy. Unfortunately, it's VERY slow when I try to do this, despite it being a beefy machine and Solr having lots of memory available. In the external file I have contents like:

747501=3.8294805903e-07
747500=3.8294805903e-07
1718770=4.03292174724e-07
1534562=3.8294805903e-07
1956010=3.8294805903e-07
747509=3.8294805903e-07
747508=3.8294805903e-07
1718772=3.8294805903e-07
1391385=3
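A common cause of this slowdown is that the external file is re-parsed on every request instead of once per searcher. A sketch of the usual mitigation, with illustrative names ("rank", the "file" type): define the field as an ExternalFileField and register ExternalFileFieldReloader listeners so the parsed file is cached and reloaded only when a new searcher opens:

```xml
<!-- schema.xml: "rank" and the "file" type are illustrative names -->
<fieldType name="file" class="solr.ExternalFileField" keyField="id" defVal="0"/>
<field name="rank" type="file"/>

<!-- solrconfig.xml: reload the external file per new searcher, not per query -->
<listener event="newSearcher"
          class="org.apache.solr.schema.ExternalFileFieldReloader"/>
<listener event="firstSearcher"
          class="org.apache.solr.schema.ExternalFileFieldReloader"/>
```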

Solr: Facet one field with two outputs

安稳与你 - Submitted on 2020-01-05 08:51:53
Question: I'm using Solr for indexing products and organising them into several categories. Each document has a taxon_names multi-value field, where the categories are stored as human-readable strings for a product. Now I want to fetch all the categories from Solr and display them as clickable links to the user, without hitting the database again. At index time, I get the permalinks for every category from the MySQL database, which are stored in a multi-value field taxon_permalinks. For generating
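One way to get both outputs from a single facet, sketched under the assumption that each category's name and permalink are both known at index time: index them together as one delimited value in an additional multi-valued field (the field name and delimiter below are illustrative), then facet on that field and split client-side:

```xml
<!-- schema.xml sketch: a multi-valued field holding "name|permalink" pairs -->
<field name="taxon_facets" type="string" indexed="true" stored="true" multiValued="true"/>
```

A facet request such as /select?q=*:*&rows=0&facet=true&facet.field=taxon_facets would then return each "name|permalink" value with its count, yielding the display name and the link target in one round trip.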

Can Solr replicate external file fields file?

故事扮演 - Submitted on 2020-01-05 08:18:32
Question: I have several external file fields that get reloaded every hour. My solrconfig.xml has <dataDir>${solr.data.basedir}/${solr.core.name}</dataDir> and the external file field files live under this directory, named external_*. With Solr replication, I can only replicate the index and the config files. Is the only option to reload these files separately on the slaves and then call reloadCache on each slave individually? Or can Solr replicate the external file field files? Answer 1: Solr is able to
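For context, Solr's built-in replication handler is configured roughly as below; note that confFiles can only list files under the core's conf/ directory, so files under dataDir such as external_* fall outside its scope (a sketch, not the asker's exact config):

```xml
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <!-- only conf/ files can be listed here, not dataDir files -->
    <str name="confFiles">schema.xml,stopwords.txt</str>
  </lst>
</requestHandler>
```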

Securing Update and Delete queries in Solr

拈花ヽ惹草 - Submitted on 2020-01-05 08:13:11
Question: I have a website that displays product information using Solr, and it is managed via the URL. I am curious how I would go about preventing regular users from updating or deleting my Apache Solr documents via the URL. I want only admins to be able to submit these queries. I would assume that there is a way to have a username and password verify that an arbitrary user is an admin, thus allowing the URL request to modify data. This is useful, but the problem is that I don't want
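In Solr 5.3 and later, one way to restrict updates is the BasicAuth and rule-based authorization plugins configured via security.json; a minimal sketch, assuming a single admin user (the credential hash below is a placeholder, and older deployments instead restricted /update paths through servlet-container authentication):

```json
{
  "authentication": {
    "class": "solr.BasicAuthPlugin",
    "credentials": { "admin": "PLACEHOLDER-HASH PLACEHOLDER-SALT" }
  },
  "authorization": {
    "class": "solr.RuleBasedAuthorizationPlugin",
    "permissions": [ { "name": "update", "role": "admin" } ],
    "user-role": { "admin": ["admin"] }
  }
}
```

With only the update permission defined, modifying requests require the admin credentials while read-only queries remain open.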

How to get DocValue by document ID in Lucene 7+?

送分小仙女□ - Submitted on 2020-01-05 08:01:17
Question: I'm adding a DocValue to a document with doc.add(new BinaryDocValuesField("foo", new BytesRef("bar"))); To retrieve that value for a specific document with ID docId, I call DocValues.getBinary(reader, "foo").get(docId).utf8ToString(); The get function in BinaryDocValues is supported up to Lucene 6.6, but from Lucene 7.0 on it no longer seems to be available. So, how do I get the DocValue by document ID in Lucene 7+ (without having to iterate over BinaryDocValues / DocIdSetIterator,
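In Lucene 7 the doc-values API became iterator-based, and advanceExact is the usual replacement for the removed get(docId). A sketch against the Lucene 7+ API, assuming a LeafReader and docId supplied by the surrounding code:

```java
import java.io.IOException;
import org.apache.lucene.index.BinaryDocValues;
import org.apache.lucene.index.DocValues;
import org.apache.lucene.index.LeafReader;

class DocValueLookup {
    /** Returns the binary doc value of field for docId, or null if absent. */
    static String getValue(LeafReader reader, String field, int docId) throws IOException {
        // Obtain a fresh iterator; doc-values iterators can only move forward.
        BinaryDocValues dv = DocValues.getBinary(reader, field);
        if (dv.advanceExact(docId)) {   // true if docId has a value for this field
            return dv.binaryValue().utf8ToString();
        }
        return null;
    }
}
```

Because the iterator only advances, retrieve documents in increasing docId order within one pass, or create a new BinaryDocValues instance for each random access.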

Solr - Missing Required Field

血红的双手。 - Submitted on 2020-01-05 07:44:11
Question: Solr is reporting that it is missing a required field (documentId), but the field and value are being passed to Solr. From the schema:

<fields>
  <field name="id" type="string" indexed="true" stored="true" required="true" />
  <field name="documentId" type="string" indexed="true" stored="true" required="true" />
</fields>

According to the Solr log, the documentId is being passed in:

org.apache.solr.core.SolrCore execute INFO: [] webapp=/solr path=/update/extract params={waitSearcher=true&commit
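Since the log shows the /update/extract path, one thing to check: with the extracting request handler, fields that are not part of the extracted document body must be supplied as literal.<field> request parameters. A hedged sketch (host, IDs, and file name are illustrative, not from the question):

```shell
curl "http://localhost:8983/solr/update/extract?literal.id=doc1&literal.documentId=doc1&commit=true" \
     -F "file=@example.pdf"
```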

NoSuchMethodError with slf4j

走远了吗. - Submitted on 2020-01-05 07:24:52
Question: I'm trying to use Solr with slf4j and logback, and when I shut down Solr with CTRL+C, I get this error:

java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V

This doesn't happen at compile time. I've checked the method signature for org.slf4j.spi.LocationAwareLogger.log in version 1.6.4, and it seems to be correct:

public void log(Marker marker, String fqcn, int level, String
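A NoSuchMethodError at runtime despite a clean compile usually means two different slf4j-api versions end up on the runtime classpath (for example, an older copy bundled with the Solr webapp shadowing the newer one). A Maven sketch of the usual fix, pinning a single version (the logback version below is illustrative):

```xml
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.6.4</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <!-- logback-classic depends on slf4j-api; keep the versions aligned -->
  <version>1.0.13</version>
</dependency>
```

Checking the runtime classpath (or the WAR's WEB-INF/lib) for duplicate slf4j-api jars is usually the quickest diagnostic.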

Getting only 10 rows in Solr Cassandra search

大城市里の小女人 - Submitted on 2020-01-05 07:17:09
Question: I am working on DataStax Cassandra with Apache Solr for multiple partial searches. The issue is that I always get only 10 rows, even though a count(*) query shows that 1300 rows match this particular query:

nandan@cqlsh:testo> select id from empo where solr_query = 'isd:9*';

 id
--------------------------------------
 5ee5fca6-6f48-11e6-8b77-86f30ca893d3
 27e3e3bc-6f48-11e6-8b77-86f30ca893d3
 f3156e76-6f47-11e6-8b77-86f30ca893d3
 f315ac74-6f47-11e6-8b77-86f30ca893d3
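The 10-row cap matches Solr's default rows value, which DSE applies to solr_query results unless the CQL statement specifies its own limit; a sketch using the table and row count from the question:

```sql
-- Request all ~1300 matching rows explicitly:
select id from empo where solr_query = 'isd:9*' limit 1300;
```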