The anomaly logging algorithms developed for Arcadia (as shown on the website) have been completely abandoned and replaced with software that logs long-term data at high or low sample rates. The reason is simple: as I began studying how data are analyzed and compared in peer-reviewed studies, I realized that by ignoring baseline data, you have nothing to compare your anomalies to. Automatically picking out only the data with substantially large variations leaves you with nothing but graphs that are interesting to look at. There's no way to call that scholarly research with no baseline data or statistics to back it up (but the graphs sure do look good on websites, don't they?).

     So, as stated earlier, all data is now logged at a constant rate, normally around 10 hertz, in logs about an hour in length. This data can then be statistically analyzed, so measures like standard deviation, Z-scores, and probability can be calculated for comparison between haunt and control sites. Not only that, but various forms of digital signal processing can be applied, such as FIR...
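To make the idea concrete, here is a minimal sketch of the kind of comparison described above: score each reading from a "haunt" log against the mean and standard deviation of a "control" (baseline) log, plus the simplest possible FIR filter (an equal-weight moving average). The sample values and variable names are purely illustrative, not actual field data, and this is not the author's software.

```python
import math

def stats(samples):
    """Mean and (population) standard deviation of a list of readings."""
    n = len(samples)
    mean = sum(samples) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / n)
    return mean, sd

def z_score(reading, mean, sd):
    """How many standard deviations a single reading sits from the baseline mean."""
    return (reading - mean) / sd

def moving_average(samples, k=3):
    """Simplest FIR filter: an equal-weight window of k taps."""
    return [sum(samples[i:i + k]) / k for i in range(len(samples) - k + 1)]

# Hypothetical constant-rate logs (made-up numbers for illustration only)
control = [0.40, 0.50, 0.45, 0.50, 0.42, 0.48]  # baseline site
haunt   = [0.40, 0.50, 0.46, 1.90, 0.44, 0.47]  # site under study

c_mean, c_sd = stats(control)
# Score each haunt-site reading against the control-site baseline;
# a large |z| flags a reading that the baseline alone cannot explain.
scores = [z_score(x, c_mean, c_sd) for x in haunt]
```

The point is that the z-scores only mean something because the control log supplies the mean and standard deviation; without that baseline, the 1.90 spike is just a pretty bump on a graph.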

All materials Copyright 2002 - 2009 by JDF of GhostGadgets.com
All Rights Reserved