computer-science

Where can I learn more about “ant colony” optimizations?

半城伤御伤魂 submitted on 2019-12-03 00:23:30
I've been reading things here and there for a while now about using an "ant colony" model as a heuristic approach to optimizing various types of algorithms. However, I have yet to find an article or book that discusses ant colony optimization in an introductory manner, or even in much detail. Can anyone point me at some resources where I can learn more about this idea?

On the off chance that you know German (yes, sorry …), a friend and I have written an introduction with code on this subject which I myself find quite passable. The text and code use the example of TSP to introduce the
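The German text and code are not reproduced here, but the basic Ant System idea it applies to the TSP can be sketched roughly as follows. This is a minimal, illustrative Python sketch with made-up parameter values, not the authors' implementation: ants build tours by probabilistically preferring short edges with strong pheromone, and shorter tours deposit more pheromone for later ants.

```python
import math
import random

def aco_tsp(cities, n_ants=20, n_iters=100, alpha=1.0, beta=3.0,
            evaporation=0.5, q=1.0):
    """Toy Ant System for the symmetric TSP. cities: list of (x, y) points."""
    n = len(cities)
    dist = [[math.dist(cities[i], cities[j]) for j in range(n)] for i in range(n)]
    tau = [[1.0] * n for _ in range(n)]                # pheromone on each edge
    best_tour, best_len = None, float("inf")

    def tour_length(tour):
        return sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))

    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = random.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                # Next city chosen with probability ~ pheromone^alpha * (1/distance)^beta
                weights = [(j, tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta)
                           for j in unvisited]
                r = random.uniform(0, sum(w for _, w in weights))
                for j, w in weights:
                    r -= w
                    if r <= 0:
                        break
                tour.append(j)
                unvisited.remove(j)
            tours.append(tour)

        # Evaporate old pheromone, then let each ant deposit on its tour,
        # with shorter tours depositing more.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1.0 - evaporation
        for tour in tours:
            length = tour_length(tour)
            if length < best_len:
                best_tour, best_len = tour, length
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += q / length
                tau[b][a] += q / length

    return best_tour, best_len

random.seed(1)
points = [(random.random(), random.random()) for _ in range(12)]
print(aco_tsp(points, n_iters=50))
```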

What is the computer science definition of entropy?

旧城冷巷雨未停 submitted on 2019-12-03 00:19:08
Question: I've recently started a course on data compression at my university. However, I find the use of the term "entropy" as it applies to computer science rather ambiguous. As far as I can tell, it roughly translates to the "randomness" of a system or structure. What is the proper definition of computer science "entropy"?

Answer 1: Entropy can mean different things:

Computing: In computing, entropy is the randomness collected by an operating system or application for use in cryptography or other uses
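For the information-theoretic sense that data compression cares about, Shannon entropy of a symbol distribution is H = -Σ p(x) log2 p(x), the average number of bits of information per symbol. A small illustration (not taken from the answer above):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """H = -sum over symbols x of p(x) * log2(p(x)), in bits per symbol."""
    counts = Counter(data)
    total = len(data)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))   # 0.0 -> perfectly predictable, compresses well
print(shannon_entropy("abababab"))   # 1.0 -> one bit of information per symbol
print(shannon_entropy("abcdefgh"))   # 3.0 -> eight equally likely symbols
```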

Garbage Collection

对着背影说爱祢 submitted on 2019-12-03 00:13:50
I am not able to understand a few things about garbage collection. Firstly, how is data allocated space? I.e., on the stack or the heap? (As far as I know, all static or global variables are assigned space on the stack and local variables are assigned space on the heap.) Second, does the GC run on data on the stack or on the heap? I.e., a GC algorithm like mark/sweep would refer to data on the stack as the root set, right? And then map all the reachable variables on the heap by checking which variables on the heap refer to the root set. What if a program does not have a global variable? How does the algorithm work then? Regards, darkie

It
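To make the mark/sweep idea mentioned in the question concrete, here is a deliberately tiny sketch (the object graph and root set are invented for illustration): marking starts from the roots (stack slots, registers, globals), and anything on the heap left unmarked afterwards is garbage.

```python
class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references to other heap objects
        self.marked = False

def mark(obj):
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def mark_sweep(heap, roots):
    # Mark phase: everything reachable from the root set survives.
    for r in roots:
        mark(r)
    # Sweep phase: unmarked objects are garbage; reset marks on survivors.
    live = []
    for obj in heap:
        if obj.marked:
            obj.marked = False
            live.append(obj)
    return live

# Root set = whatever the stack, registers and globals point at; here a plain list.
a, b, c, d = Obj("a"), Obj("b"), Obj("c"), Obj("d")
a.refs.append(b)          # a -> b is reachable via the root a
c.refs.append(d)          # c -> d, but nothing in the root set points at c
heap, roots = [a, b, c, d], [a]
print([o.name for o in mark_sweep(heap, roots)])   # ['a', 'b']
```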

Eventual consistency in plain English

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-03 00:08:52
Question: I often hear about eventual consistency in different talks about NoSQL, data grids, etc. It seems that the definition of eventual consistency varies across sources (and maybe even depends on the concrete data storage). Can anyone give a simple explanation of what eventual consistency is in general terms, not related to any concrete data storage?

Answer 1: Eventual consistency: I watch the weather report and learn that it's going to rain tomorrow. I tell you that it's going to rain tomorrow. Your
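In system terms, the analogy maps to replicas: a write lands on one replica first and reaches the others only after some replication delay, during which reads may return stale data; once updates stop arriving, every replica converges to the same value. A toy illustration (the Replica class and its last-writer-wins rule are invented for this sketch, not any particular product's protocol):

```python
class Replica:
    """A single copy of the data; version tracks how recent its value is."""
    def __init__(self, name):
        self.name, self.value, self.version = name, None, 0

    def write(self, value):
        self.version += 1
        self.value = value

def sync(source, target):
    """Anti-entropy step: the newer version wins (simplistic last-writer-wins)."""
    if source.version > target.version:
        target.value, target.version = source.value, source.version

a, b = Replica("A"), Replica("B")
a.write("rain tomorrow")   # the write is accepted by replica A only
print(b.value)             # None -> a read from B right now is stale
sync(a, b)                 # replication happens some time later...
print(b.value)             # 'rain tomorrow' -> the replicas have converged
```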

What Computer Science concepts should I know? [closed]

a 夏天 submitted on 2019-12-02 23:58:37
Question: What concepts in Computer Science do you think have made you a better programmer? My degree was in Mechanical Engineering, so having

Is it possible to evaluate lambda calculus terms efficiently?

痞子三分冷 submitted on 2019-12-02 23:42:04
I've been writing a lot of programs in the lambda calculus recently and I wish I could run some of them in real time. Yet, as much as the trending functional paradigm is based on the lambda calculus and the rule of β-reduction, I couldn't find a single evaluator that isn't a toy, not meant for efficiency. Functional languages are supposed to be fast, but those I know don't actually provide access to normal forms (see Haskell's lazy evaluator, Scheme's closures and so on), so they don't work as LC evaluators. That makes me wonder: is it just impossible to evaluate lambda calculus terms efficiently,
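For reference, a plain (and decidedly non-optimized) normal-order evaluator that does reach full normal forms is only a screenful of code; the term representation and names below are invented for illustration, and it makes no attempt at the efficiency the question is asking about:

```python
import itertools

# Terms: ('var', name) | ('lam', name, body) | ('app', fun, arg)
_fresh = (f"_v{i}" for i in itertools.count())

def free_vars(t):
    if t[0] == 'var':
        return {t[1]}
    if t[0] == 'lam':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def subst(t, name, value):
    """Capture-avoiding substitution t[name := value]."""
    if t[0] == 'var':
        return value if t[1] == name else t
    if t[0] == 'app':
        return ('app', subst(t[1], name, value), subst(t[2], name, value))
    bound, body = t[1], t[2]
    if bound == name:                     # the binder shadows `name`
        return t
    if bound in free_vars(value):         # rename the binder to avoid capture
        new = next(_fresh)
        body, bound = subst(body, bound, ('var', new)), new
    return ('lam', bound, subst(body, name, value))

def whnf(t):
    """Reduce the head until it is no longer a beta-redex."""
    while t[0] == 'app':
        f = whnf(t[1])
        if f[0] == 'lam':
            t = subst(f[2], f[1], t[2])
        else:
            return ('app', f, t[2])
    return t

def normalize(t):
    """Normal-order (leftmost-outermost) reduction to full normal form."""
    t = whnf(t)
    if t[0] == 'lam':
        return ('lam', t[1], normalize(t[2]))
    if t[0] == 'app':
        return ('app', normalize(t[1]), normalize(t[2]))
    return t

# Church numeral demo: SUCC ONE normalizes to TWO, i.e. \f.\x. f (f x).
ONE  = ('lam', 'f', ('lam', 'x', ('app', ('var', 'f'), ('var', 'x'))))
SUCC = ('lam', 'n', ('lam', 'f', ('lam', 'x',
        ('app', ('var', 'f'),
                ('app', ('app', ('var', 'n'), ('var', 'f')), ('var', 'x'))))))
print(normalize(('app', SUCC, ONE)))
```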

Find the words in a long stream of characters. Auto-tokenize

五迷三道 submitted on 2019-12-02 23:28:19
How would you find the correct words in a long stream of characters?

Input: "The revised report onthesyntactictheoriesofsequentialcontrolandstate"
Google's output: "The revised report on syntactic theories sequential controlandstate" (which is close enough considering the time in which they produced the output)

How do you think Google does it? How would you increase the accuracy?

I would try a recursive algorithm like this: try inserting a space at each position; if the left part is a word, then recur on the right part. Count the number of valid words / number of total words in all the final
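The recursive idea sketched in the question can be memoized so each suffix is only segmented once. A toy version with an invented mini-dictionary is below; a real system would instead score candidate splits with word and n-gram probabilities learned from a large corpus, which is presumably closer to what Google does.

```python
from functools import lru_cache

# Toy dictionary; in practice this would be a large word list or a language model.
WORDS = {"the", "revised", "report", "on", "syntactic", "theories",
         "of", "sequential", "control", "and", "state"}

def segment(text):
    """Split `text` into dictionary words, or return None if impossible."""
    @lru_cache(maxsize=None)
    def go(s):
        if not s:
            return []
        for i in range(len(s), 0, -1):        # try longer prefixes first
            head, tail = s[:i], s[i:]
            if head in WORDS:
                rest = go(tail)
                if rest is not None:
                    return [head] + rest
        return None
    return go(text)

print(segment("onthesyntactictheoriesofsequentialcontrolandstate"))
# ['on', 'the', 'syntactic', 'theories', 'of', 'sequential', 'control', 'and', 'state']
```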

Confused between Temporal and Spatial locality in real life code

ⅰ亾dé卋堺 submitted on 2019-12-02 23:17:29
I was reading this question, and I wanted to ask more about the code it showed, i.e.:

for (i = 0; i < 20; i++)
    for (j = 0; j < 10; j++)
        a[i] = a[i] * j;

The questions are: I understand temporal locality, and I think that the references to i and j should be temporal locality. Am I right? I also understand spatial locality; as the question I linked answers, the references to a[i] should be spatial locality. Am I right? The person said, "The inner loop will call the same memory address when accessing a[i] ten times, so that's an example of temporal locality, I guess. But is there spatial locality also in the above
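As a rough annotation of the access pattern (a Python mirror of the C loop above; Python lists do not model CPU caches faithfully, so read this only as an illustration of which accesses repeat and which are adjacent, not as a cache experiment):

```python
a = [1.0] * 20

for i in range(20):          # the outer loop walks the array front to back
    for j in range(10):
        # Temporal locality: the same locations (a[i], i, j) are accessed
        # again and again within the inner loop's ten iterations.
        a[i] = a[i] * j
    # Spatial locality: the next outer iteration touches a[i + 1], which
    # lies right next to a[i] in memory, so it is likely already cached.
```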

How does semaphore work?

巧了我就是萌 submitted on 2019-12-02 23:12:35
Can the semaphore be lower than 0? I mean, say I have a semaphore with N = 3 and I call "down" 4 times; then N will remain 0 but one process will be blocked? And the same the other way: if at the beginning I call "up", can N be higher than 3? Because as I see it, if N can go higher than 3 when I call "up" a couple of times at the beginning, then later I could call "down" more times than I should, thus putting more processes in the critical section than the semaphore allows. If someone could clarify this a bit for me, I would much appreciate it.

Greg: Calling down when it's 0 should not work. Calling up when it's 3
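The behaviour being asked about can be seen directly with a counting semaphore. For example, Python's threading.BoundedSemaphore blocks (or fails) on an acquire when the counter is 0 rather than going negative, and refuses a release that would push the counter above its initial value:

```python
import threading

sem = threading.BoundedSemaphore(3)            # counter starts at N = 3

sem.acquire(); sem.acquire(); sem.acquire()    # counter: 3 -> 2 -> 1 -> 0

# A fourth "down" does not make the counter negative: a blocking acquire()
# would wait here until someone releases; a non-blocking one simply fails.
print(sem.acquire(blocking=False))             # False

sem.release(); sem.release(); sem.release()    # counter back up to 3
try:
    sem.release()                              # an extra "up" beyond the initial value
except ValueError as err:
    print("refused:", err)                     # BoundedSemaphore enforces the limit
```

A plain threading.Semaphore, by contrast, lets release() push the counter above its initial value, which is exactly the hazard the question worries about.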

In Natural language processing, what is the purpose of chunking?

瘦欲@ submitted on 2019-12-02 22:34:22
In natural language processing, what is the purpose of chunking?

Chunking is also called shallow parsing, and it's basically the identification of parts of speech and short phrases (like noun phrases). Part-of-speech tagging tells you whether words are nouns, verbs, adjectives, etc., but it doesn't give you any clue about the structure of the sentence or the phrases in the sentence. Sometimes it's useful to have more information than just the parts of speech of words, but you don't need the full parse tree that you would get from parsing. An example of when chunking might be preferable is Named
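One way to see the difference between POS tagging and chunking is NLTK's regular-expression chunker (not mentioned in the answer above, just a common illustration; the grammar below is a minimal noun-phrase pattern):

```python
import nltk

# May require one-time model downloads for the tokenizer and tagger, e.g.:
# nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')

sentence = "The quick brown fox jumped over the lazy dog"
tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
print(tagged)                  # POS tags only, no phrase structure

# Chunk grammar: an NP is an optional determiner, any adjectives, then a noun.
chunker = nltk.RegexpParser("NP: {<DT>?<JJ>*<NN.*>}")
print(chunker.parse(tagged))   # same tokens, now grouped into NP chunks
```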