This webpage includes the original data set used and the results obtained by applying track-before-detect. Below is a description of the dataset. N.B.: Data are ordered, timestamped, single-valued metrics. All data files contain anomalies, unless otherwise noted.
realAWSCloudwatch/
AWS server metrics as collected by the AmazonCloudwatch service. Example metrics include CPU Utilization, Network Bytes In, and Disk Read Bytes.
realAdExchange/
Online advertisement clicking rates, where the metrics are cost-per-click (CPC) and cost per thousand impressions (CPM). One of the files is normal, without anomalies.
realKnownCause/
This is data for which we know the anomaly causes; no hand labeling.
realTraffic/
Real-time traffic data from the Twin Cities Metro area in Minnesota, collected by the Minnesota Department of Transportation. Metrics include occupancy, speed, and travel time from specific sensors.
realTweets/
A collection of Twitter mentions of large publicly-traded companies such as Google and IBM. The metric value represents the number of mentions for a given ticker symbol every 5 minutes.
artificialWithAnomaly/
Artificially-generated data with varying types of anomalies.
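Each file in the directories above follows the single-valued, timestamped format described at the top of this page. Below is a minimal loading sketch, assuming a pandas environment; the file name is hypothetical and the "timestamp"/"value" column names are an assumption based on that description, so check the actual headers before use.

```python
import pandas as pd

# Load one single-valued, timestamped series (file name is hypothetical).
df = pd.read_csv(
    "realAWSCloudwatch/ec2_cpu_utilization.csv",  # assumed path, for illustration
    parse_dates=["timestamp"],                    # assumed column name
)

# Keep the data ordered in time, as the dataset description specifies.
df = df.sort_values("timestamp").reset_index(drop=True)
print(df.head())
```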
To facilitate the interpretation of the results obtained with track-before-detect, we used three types of figures (scatter, density_heatmap, and line). In addition, a CSV file is generated for each analyzed case, in which you can find the calculated anomaly score.
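As a sketch of how such figures could be drawn from one of the generated CSV files, the snippet below uses plotly express (whose function names match the figure types listed above); the file name and the "timestamp"/"value"/"anomaly_score" columns are assumptions for illustration, not the exact output format of this webpage.

```python
import pandas as pd
import plotly.express as px

# Assumed per-case results file with timestamp, value, and anomaly_score columns.
results = pd.read_csv("results_case_example.csv", parse_dates=["timestamp"])

# Scatter: raw values over time, coloured by the calculated anomaly score.
px.scatter(results, x="timestamp", y="value", color="anomaly_score").show()

# Density heatmap: where (value, anomaly_score) pairs concentrate.
px.density_heatmap(results, x="value", y="anomaly_score").show()

# Line: the anomaly score over time.
px.line(results, x="timestamp", y="anomaly_score").show()
```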
Below is a mix of the results from existing methods such as the Gaussian Model, Regression Model, KNN, LOF, STREAM, INFLO, and Autoencoders.
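For context before the results, here is a minimal sketch of one such baseline (LOF via scikit-learn) applied to a single series; the file name, parameters, and scoring convention are assumptions and not the exact configuration used to produce the results shown below.

```python
import pandas as pd
from sklearn.neighbors import LocalOutlierFactor

# Hypothetical input file, assumed timestamp/value columns.
df = pd.read_csv("realAWSCloudwatch/ec2_cpu_utilization.csv", parse_dates=["timestamp"])

X = df[["value"]].to_numpy()              # single-valued metric as the only feature
lof = LocalOutlierFactor(n_neighbors=20)  # assumed neighbourhood size
lof.fit(X)

# Convention assumed here: higher score = more anomalous.
df["anomaly_score"] = -lof.negative_outlier_factor_
print(df.nlargest(5, "anomaly_score"))
```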