timing

requestAnimationFrame [now] vs performance.now() time discrepancy

烈酒焚心 submitted on 2019-12-18 16:49:11
Question: Assumptions: the rAF now timestamp is calculated at the time the frame's set of callbacks is triggered. Therefore any blocking that happens before the first callback of that frame runs doesn't affect the rAF now, and it's accurate, at least for that first callback. Any performance.now() measurement made before a rAF set is triggered should be earlier than the rAF now. Test: record before (a baseline time before anything happens), set the next rAF, then compare the rAF now and the actual performance.now() to before.
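A minimal sketch of that test, assuming a browser environment; the variable names are illustrative:

```javascript
// Baseline taken before scheduling the frame callback.
const before = performance.now();

requestAnimationFrame((rafNow) => {
  // rafNow is the DOMHighResTimeStamp the browser passes to rAF callbacks.
  const inCallback = performance.now();
  console.log('before           :', before);
  console.log('rAF now          :', rafNow);
  console.log('performance.now():', inCallback);
  // Under the stated assumption: before <= rafNow <= inCallback.
  // In practice rafNow can come out earlier than before, because many
  // browsers stamp it at the start of the frame (vsync), not at the
  // moment the callback set actually begins to run.
});
```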

Fade elements in incrementally on window load

戏子无情 submitted on 2019-12-18 13:47:08
Question: I'm looking to fade in divs with a certain class in code order, with each fade starting maybe 250ms after the last, giving the impression of a gradual page load. I've got this far, fading everything in at once: $(window).load(function(){ $('div.fade_this_please').fadeIn(4000); }); but I'm not sure how to cycle through each div and fade it in after the previous one. Can someone point me in the right direction? Any advice appreciated! Answer 1: This fades all divs into view, each with
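One possible approach (a sketch, not the answer's exact code): stagger each element's fade by its index, assuming the divs start hidden (display: none):

```javascript
$(window).on('load', function () {
  $('div.fade_this_please').each(function (i) {
    // delay() queues the effect, so element i starts fading
    // i * 250ms after the first one begins.
    $(this).delay(i * 250).fadeIn(4000);
  });
});
```

With delay() the fades overlap, which matches "each fade coming maybe 250ms after the last"; chaining each fadeIn's completion callback instead would make them strictly sequential.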

What is the best way to measure Client Side page load times?

ぐ巨炮叔叔 submitted on 2019-12-18 10:16:23
Question: I'm looking to monitor the end-user experience of our website and link that with timing information already logged on the server side. My assumption is that this will require JavaScript to capture timestamps at the start of the request (window.onbeforeunload) and at the end of loading (window.onload). Basically this: "Measuring Web application response time: Meet the client". Is there a better approach? What kind of performance penalty should I be expecting (order of magnitude)? How good are
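One modern approach is the Navigation Timing API, which browsers record automatically; this sketch reads the entry and posts it to a hypothetical /perf-log endpoint for correlation with the server-side logs:

```javascript
window.addEventListener('load', () => {
  // Defer one tick so loadEventEnd has been filled in.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType('navigation');
    if (nav) {
      console.log('TTFB          :', nav.responseStart, 'ms');
      console.log('DOM ready     :', nav.domContentLoadedEventEnd, 'ms');
      console.log('full page load:', nav.loadEventEnd, 'ms');
      // sendBeacon survives navigation; the endpoint name is an assumption.
      navigator.sendBeacon('/perf-log', JSON.stringify(nav.toJSON()));
    }
  }, 0);
});
```

Reading these entries costs essentially nothing at page-load time, since the browser collects them whether or not you look at them.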

How to realise long-term high-resolution timing on Windows using C++?

心已入冬 submitted on 2019-12-18 05:02:58
Question: I need to get exact timestamps every couple of ms (20, 30, 40ms) over a long period of time (a couple of hours). The function in which the timestamp is taken is invoked as a callback by a 3rd-party library. Using GetSystemTime() one can get the correct system timestamp, but only with millisecond accuracy, which is not precise enough for me. Using QueryPerformanceCounter() yields more accurate timestamps, but it is not synchronous with the system timestamp over a long period of time (see http://msdn
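One common workaround, sketched here under the assumption that periodic re-anchoring is acceptable: anchor the performance counter to the system clock once, then derive sub-millisecond timestamps from the counter alone:

```cpp
#include <windows.h>
#include <cstdio>

struct Anchor {
    LARGE_INTEGER freq;   // counter ticks per second
    LARGE_INTEGER start;  // counter value at the anchor point
    FILETIME      wall;   // absolute system time at the anchor (100ns units)
};

Anchor makeAnchor() {
    Anchor a;
    QueryPerformanceFrequency(&a.freq);
    GetSystemTimeAsFileTime(&a.wall);   // ~ms resolution, but absolute
    QueryPerformanceCounter(&a.start);  // sub-microsecond resolution
    return a;
}

// Microseconds elapsed since the anchor, from the performance counter.
long long microsSince(const Anchor& a) {
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    return (now.QuadPart - a.start.QuadPart) * 1000000LL / a.freq.QuadPart;
}

int main() {
    Anchor a = makeAnchor();
    Sleep(25);  // stand-in for the 20-40ms third-party callback interval
    std::printf("elapsed since anchor: %lld us\n", microsSince(a));
}
```

Over hours the counter and the system clock will still drift apart, so a long-running process would re-anchor periodically to bound the error.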

How do you run a long PHP script and keep sending updates to the browser via HTTP?

百般思念 submitted on 2019-12-18 04:54:32
Question: How do you run a long PHP script and keep sending updates to the browser via HTTP? Something to do with output buffering, but I don't know exactly how. Answer 1: Output buffering is indeed the right direction: you start output buffering with ob_start(), just like you would start a session with session_start(), somewhere near the top of your script, before any output is sent. Then you can use ob_flush() and flush() to keep flushing the output. For example, if you are in a foreach loop and at the end of
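A sketch of that pattern; the sleep() call is a stand-in for real work:

```php
<?php
ob_start();

for ($i = 1; $i <= 10; $i++) {
    sleep(1);                      // stand-in for one slow unit of work
    echo "Finished step $i of 10<br>\n";
    ob_flush();                    // push PHP's own output buffer out...
    flush();                       // ...then the web server's buffer
}

echo "Done.";
```

Whether the browser actually sees the updates incrementally also depends on the environment: output_buffering in php.ini, gzip compression, and any buffering proxy in front of the server can all hold the output back.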

Measuring time with a resolution of microseconds in C++?

穿精又带淫゛_ submitted on 2019-12-17 20:51:21
Question: I'm looking for a way to measure microseconds in C++ on Windows. I read about the clock() function, but it returns only milliseconds... Is there a way to do it? Answer 1: Use QueryPerformanceCounter and QueryPerformanceFrequency for the finest-grained timing on Windows. There is an MSDN article on code timing with these APIs here (sample code is in VB, sorry). Answer 2: http://www.boost.org/doc/libs/1_45_0/doc/html/date_time/posix_time.html, although, from those docs: "Get the UTC time using a sub second resolution clock. On Unix systems this is
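Where C++11 or later is available, a portable alternative to the Windows-specific APIs is std::chrono; a minimal sketch:

```cpp
#include <chrono>
#include <cstdio>

int main() {
    auto t1 = std::chrono::high_resolution_clock::now();

    volatile long long sink = 0;
    for (int i = 0; i < 1000000; ++i) sink += i;  // code under test

    auto t2 = std::chrono::high_resolution_clock::now();
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(t2 - t1);
    std::printf("elapsed: %lld us\n", static_cast<long long>(us.count()));
}
```

On Windows this clock is typically backed by QueryPerformanceCounter, so its resolution matches the accepted answer's approach.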

How do I measure duration in seconds in a shell script?

大憨熊 submitted on 2019-12-17 17:34:54
Question: I wish to find out how long an operation takes in a Linux shell script. How can I do this? Answer 1: Using the time command, as others have suggested, is a good idea. Another option is to use the magic built-in variable $SECONDS, which contains the number of seconds since the script started executing. You can say: START_TIME=$SECONDS; dosomething; ELAPSED_TIME=$(($SECONDS - $START_TIME)). I think this is bash-specific, but since you're on Linux, I assume you're using bash. Answer 2: Use the time command.
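Both answers in one sketch; the date +%s variant is a portable fallback for shells without $SECONDS:

```bash
#!/usr/bin/env bash

# bash's built-in counter of seconds since the script started
START_TIME=$SECONDS
sleep 3                               # stand-in for "dosomething"
ELAPSED_TIME=$((SECONDS - START_TIME))
echo "took ${ELAPSED_TIME}s"

# portable variant using epoch seconds
start=$(date +%s)
sleep 3
echo "took $(( $(date +%s) - start ))s"
```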

timeit versus timing decorator

…衆ロ難τιáo~ submitted on 2019-12-17 04:22:40
Question: I'm trying to time some code. First I used a timing decorator:

#!/usr/bin/env python
import time
from itertools import izip
from random import shuffle

def timing_val(func):
    def wrapper(*arg, **kw):
        '''source: http://www.daniweb.com/code/snippet368.html'''
        t1 = time.time()
        res = func(*arg, **kw)
        t2 = time.time()
        return (t2 - t1), res, func.__name__
    return wrapper

@timing_val
def time_izip(alist, n):
    i = iter(alist)
    return [x for x in izip(*[i] * n)]

@timing_val
def time_indexing(alist, n):
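For comparison, the timeit approach the title refers to; a sketch that times the izip-style grouping (zip in Python 3) over many runs:

```python
import timeit

setup = """
from random import shuffle
alist = list(range(1000))
shuffle(alist)
"""

# izip is Python 2; zip is the Python 3 equivalent used here.
stmt = "i = iter(alist); [x for x in zip(*[i] * 10)]"

# Runs the statement 1000 times and reports total seconds, which
# smooths out the single-call noise a wall-clock decorator sees.
print(timeit.timeit(stmt, setup=setup, number=1000))
```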

JavaScript, setTimeout loops?

戏子无情 submitted on 2019-12-17 03:36:27
Question: So I am working on a music program that requires multiple JavaScript elements to stay in sync with one another. I've been using setInterval, which works really well initially; however, over time the elements gradually drift out of sync, which for a music program is bad. I've read online that setTimeout is more accurate and that you can build setTimeout loops somehow, but I have not found a generic version that illustrates how this is possible. Could someone just show me a basic example of using
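A common generic version is a self-correcting setTimeout loop: each tick is scheduled against an absolute expected time, so lateness in one tick is subtracted from the next delay instead of accumulating. A sketch:

```javascript
const interval = 100; // ms per tick
let expected = performance.now() + interval;

function tick() {
  const drift = performance.now() - expected; // how late this tick fired
  // ...do the synchronized work for this beat here...
  expected += interval;
  setTimeout(tick, Math.max(0, interval - drift));
}

setTimeout(tick, interval);
```

setInterval drifts because each delay is effectively measured from whenever the previous callback ran; anchoring to the expected time keeps the long-run average on the grid.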

Measure Hadoop job time using JobControl

六眼飞鱼酱① submitted on 2019-12-14 03:04:15
Question: I used to launch my Hadoop job with the following:

long start = new Date().getTime();
boolean status = job.waitForCompletion(true);
long end = new Date().getTime();

This way I could measure the time taken by the job, once it ends, directly in my code. Now I have to use JobControl in order to express dependencies between my jobs:

JobControl jobControl = new JobControl("MyJob");
jobControl.addJob(job1);
jobControl.addJob(job2);
job3.addDependingJob(job2);
jobControl.addJob(job3);
jobControl
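A sketch of how the timing could work with JobControl, assuming job1..job3 are configured ControlledJob instances as in the question. JobControl implements Runnable, so it runs on its own thread while the caller polls allFinished():

```java
import org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl;

// Inside a method declared throws InterruptedException.
long start = System.currentTimeMillis();

JobControl jobControl = new JobControl("MyJob");
jobControl.addJob(job1);
jobControl.addJob(job2);
job3.addDependingJob(job2);
jobControl.addJob(job3);

new Thread(jobControl).start();
while (!jobControl.allFinished()) {
    Thread.sleep(500);  // poll until every job in the graph has run
}
jobControl.stop();      // shut the controller thread down

long elapsed = System.currentTimeMillis() - start;
System.out.println("Job graph took " + elapsed + " ms");
```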