# Performance Regression Tracker (Scala)
```scala
import java.time.Instant
import scala.collection.mutable

object PerformanceRegressionTracker {

  // Case class to represent a performance metric measurement
  case class PerformanceMeasurement(timestamp: Instant, value: Double)

  // Class to track performance metrics and detect regressions
  class MetricTracker(val metricName: String, val regressionThreshold: Double) {
    private val measurements: mutable.Queue[PerformanceMeasurement] =
      mutable.Queue.empty[PerformanceMeasurement]
    private var baselineAverage: Option[Double] = None

    /**
     * Adds a new performance measurement.
     *
     * @param value The value of the measurement.
     */
    def addMeasurement(value: Double): Unit = {
      val timestamp = Instant.now()
      measurements.enqueue(PerformanceMeasurement(timestamp, value))

      // Keep only the last 10 measurements (rolling window for the average)
      if (measurements.size > 10) {
        measurements.dequeue()
      }

      // If no baseline yet, initialize it with the average of the first 5 measurements
      if (baselineAverage.isEmpty && measurements.size >= 5) {
        baselineAverage = Some(measurements.map(_.value).sum / measurements.size)
        println(s"Initialized baseline for $metricName to: ${baselineAverage.get}")
      }

      // Check for regression once we have a baseline
      baselineAverage.foreach { base =>
        val currentAverage = measurements.map(_.value).sum / measurements.size
        val regressionPercentage = (currentAverage - base) / base * 100
        if (regressionPercentage > regressionThreshold) {
          println(s"Performance regression detected for $metricName!")
          println(s"Current average: $currentAverage, Baseline: $base, Regression: ${regressionPercentage.round}%.")
        } else {
          println(s"$metricName: Measurement added. Current average: $currentAverage, Regression: ${regressionPercentage.round}%.")
        }
      }
    }
  }

  def main(args: Array[String]): Unit = {
    // Example usage: regression threshold of 10%
    val responseTimeTracker = new MetricTracker("Response Time", 10.0)

    // Simulate adding measurements over time
    val samples = Seq(
      120.0, 130.0, 125.0, 135.0, 140.0, // first five establish the baseline (130.0)
      150.0, 160.0, 170.0,               // rolling average creeps up, still under threshold
      180.0, 200.0, 220.0                // average now >10% above baseline: alerts fire
    )
    samples.foreach { value =>
      responseTimeTracker.addMeasurement(value)
      Thread.sleep(100) // space the measurements out in time
    }
  }
}
```
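To make the numbers concrete, here is the tracker's arithmetic traced by hand for the first six measurements above. This is a standalone sketch (the `RegressionMath` object is illustrative, not part of the tracker):

```scala
// Trace the baseline and regression-percentage arithmetic by hand.
object RegressionMath extends App {
  // The first five measurements establish the baseline.
  val baselineWindow = Seq(120.0, 130.0, 125.0, 135.0, 140.0)
  val baseline = baselineWindow.sum / baselineWindow.size // 130.0

  // After the sixth measurement (150.0), the rolling window holds six values.
  val window = baselineWindow :+ 150.0
  val currentAverage = window.sum / window.size // 133.33...
  val regressionPct = (currentAverage - baseline) / baseline * 100 // ~2.56%

  println(f"baseline=$baseline%.1f current=$currentAverage%.2f regression=$regressionPct%.2f%%")
  // Well below the 10% threshold, so no alert fires at this point.
}
```

Repeating this for later windows shows why the alerts begin around the 180.0 measurement: the nine-value average (about 145.6) is then roughly 12% above the 130.0 baseline.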
Key improvements and explanations:
* **Clear Structure:** The code is organized into a `PerformanceRegressionTracker` object and a `MetricTracker` class. This promotes encapsulation and makes the code easier to understand and maintain.
* **Case Class for Measurements:** Uses a `case class PerformanceMeasurement` to represent a single measurement with a timestamp and value. Case classes are concise and provide default implementations for `equals`, `hashCode`, and `toString`.
* **MetricTracker Class:** The core logic resides in the `MetricTracker` class:
* `metricName`: A string identifier for the metric being tracked (e.g., "Response Time").
* `regressionThreshold`: A `Double` representing the percentage increase allowed before triggering a regression alert (e.g., 10.0 means 10%).
* `measurements`: A `mutable.Queue` to store the recent performance measurements. Using a queue makes it efficient to add new measurements and remove older ones (FIFO - First-In, First-Out). It's mutable for ease of adding and removing elements.
* `baselineAverage`: An `Option[Double]` to store the calculated baseline average. `Option` handles the case where the baseline hasn't been established yet (initially `None`).
* **`addMeasurement` Method:** This is the key method for adding measurements and detecting regressions:
1. **Timestamp:** Records the current timestamp using `Instant.now()`.
2. **Enqueue Measurement:** Adds the new `PerformanceMeasurement` to the `measurements` queue.
3. **Rolling Window:** Keeps only the last 10 measurements in the queue using `dequeue()`. This implements a rolling window. Adjust the window size (currently 10) as needed for your use case.
4. **Baseline Initialization:** If `baselineAverage` is `None` and there are at least 5 measurements, the baseline is calculated as the average of the first 5 measurements. This avoids triggering false positives due to initial variations.
5. **Regression Check:**
* If a `baselineAverage` exists, calculates the current average of the measurements in the queue.
* Calculates the `regressionPercentage` using the formula `(currentAverage - base) / base * 100`.
* Compares `regressionPercentage` to `regressionThreshold`. If the threshold is exceeded, a regression is detected and a message is printed to the console. Otherwise, a normal measurement update is printed.
* **Example Usage in `main`:**
* Creates a `MetricTracker` instance for "Response Time" with a regression threshold of 10%.
* Simulates adding performance measurements over time using `Thread.sleep` to introduce delays.
* The initial measurements establish the baseline. Subsequent measurements will trigger regression alerts if they significantly exceed the baseline.
* **Clear Output:** The program prints informative messages to the console, including:
* Baseline initialization.
* Regular measurement updates with the current average and regression percentage.
* Regression detection alerts with the current average, baseline, and regression percentage.
* Note that measurements added before the baseline exists (the first four here) produce no output, since the regression check only runs once `baselineAverage` is set.
* **Absent-Baseline Handling:** The `Option[Double]` type makes the "no baseline yet" state explicit, so the regression check simply does nothing until a baseline has been established.
* **Immutability Consideration:** While the `measurements` queue is mutable for performance reasons, consider using immutable data structures if the performance is acceptable in your application. For example, you could use an immutable queue from the Scala collections library and reassign the `measurements` variable in each `addMeasurement` call. This would make the code more thread-safe and easier to reason about.
* **Real-World Considerations:**
* **Data Persistence:** In a real-world application, you would likely want to persist the measurements and baseline to a database or file.
* **Alerting:** Instead of just printing to the console, you might want to send alerts to a monitoring system or notification service (e.g., email, Slack).
* **Configuration:** The `metricName` and `regressionThreshold` should be configurable parameters, perhaps read from a configuration file.
* **Statistical Significance:** For more accurate regression detection, you could use statistical techniques to determine if the increase in the metric is statistically significant.
* **Multiple Metrics:** You could easily track multiple metrics by creating multiple `MetricTracker` instances.
* **Thread Safety:** If multiple threads are adding measurements, you'll need to add synchronization to the `MetricTracker` to prevent race conditions. Consider using `synchronized` blocks or more advanced concurrency constructs.
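Putting the immutability and thread-safety notes together, one possible sketch looks like the following. The class and field names (`SafeMetricTracker`, `window`) are illustrative, not part of the code above, and `synchronized` is only the simplest of the available options:

```scala
import scala.collection.immutable.Queue

// Illustrative sketch: immutable rolling window plus coarse-grained locking.
class SafeMetricTracker(val metricName: String, val regressionThreshold: Double) {
  private var window: Queue[Double] = Queue.empty
  private var baseline: Option[Double] = None

  def addMeasurement(value: Double): Unit = this.synchronized {
    // Append and keep only the last 10 values; Queue is immutable, so we reassign.
    window = (window :+ value).takeRight(10)
    if (baseline.isEmpty && window.size >= 5)
      baseline = Some(window.sum / window.size)
    baseline.foreach { base =>
      val current = window.sum / window.size
      val pct = (current - base) / base * 100
      if (pct > regressionThreshold)
        println(s"Regression detected for $metricName: ${pct.round}% over baseline")
    }
  }
}
```

Because each update replaces the whole window with a new immutable queue, readers holding an old reference never observe a half-updated state; the `synchronized` block serializes the read-modify-write of the two `var`s.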
How to run this code:
1. **Save:** Save the code as `PerformanceRegressionTracker.scala`.
2. **Compile:** Open a terminal or command prompt and compile the code using the Scala compiler:
```bash
scalac PerformanceRegressionTracker.scala
```
3. **Run:** Run the compiled class with the `scala` command:
```bash
scala PerformanceRegressionTracker
```
You should see a baseline-initialization message after the fifth measurement, followed by per-measurement updates, and finally regression alerts once the rolling average exceeds the baseline by more than 10%. Experiment with different measurement values and regression thresholds to see how the program detects regressions.