How much time does a program take to access RAM?

Submitted by 北城以北 on 2021-02-10 22:22:41

Question


I've seen explanations for why RAM is accessed in constant time (O(1)) and explanations for why it's accessed in logarithmic time (O(log n)). Frankly, neither makes much sense to me: what is n in the big-O notation, and how does it make sense to measure the speed at which a physical device operates using big-O?

I understand the argument that array access is constant time: if you have an array a, then the kth element is at address a + k * size_of_type, so the address can be computed directly. If you know the address you want to load from or store to, wouldn't that take a constant amount of time, in the sense that it takes the same time no matter the location?

Someone told me that looking something up in RAM (like an element in an array) takes longer than O(1) because it needs to find the right page. I believe this is wrong, as paging pertains to the hard disk, not RAM.
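The address arithmetic described above can be illustrated directly. This is a minimal sketch (the array size and element values are arbitrary choices for the demo): element k of a flat array of fixed-size integers lives at base + k * itemsize, so fetching it is a single multiply-add and one load, regardless of k.

```python
from array import array

# A flat array of C-style integers: element k sits at
# base_address + k * a.itemsize, so indexing is O(1) in k.
a = array('i', range(1_000_000))

def access(k):
    # One address computation plus one load -- no search, no traversal.
    return a[k]

# Accessing the first and the last element is the same operation;
# only the computed address differs.
first = access(0)
last = access(999_999)
print(a.itemsize)  # size in bytes of each element (platform-dependent)
```

In a compiled language the same indexing lowers to one multiply-add and one memory load, which is why textbooks call it O(1): the number of operations does not depend on k (caching aside, the wall-clock time doesn't either).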


Answer 1:


I think it is a nanoseconds-scale value — roughly tens of nanoseconds for an uncached main-memory access — which is far faster than accessing a mechanical disk (about 5 to 80 ms).
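As a rough illustration of measuring memory-access cost (a sketch, not a precise benchmark — in Python, interpreter overhead dominates, so the number it prints is much larger than raw DRAM latency), one can chase a shuffled chain of indices so that each access depends on the previous result, which in a compiled language defeats prefetching and exposes main-memory latency:

```python
import random
import time

# Build a random permutation and chase it: each lookup's input is the
# previous lookup's output, so accesses cannot be overlapped or predicted.
n = 1 << 20
perm = list(range(n))
random.shuffle(perm)

start = time.perf_counter()
i = 0
for _ in range(n):
    i = perm[i]          # dependent, randomly-ordered access
elapsed = time.perf_counter() - start

print(f"{elapsed / n * 1e9:.0f} ns per dependent access (incl. interpreter overhead)")
```

The same pointer-chasing pattern written in C typically reports on the order of 50–100 ns per access on commodity hardware once the working set exceeds the caches, versus single-digit nanoseconds when it fits in L1 — which is why a single big-O label glosses over the memory hierarchy.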



Source: https://stackoverflow.com/questions/34312820/how-much-time-does-a-program-take-to-access-ram
