From 0224731f9e98026d8507bd3f75da1c04261753a8 Mon Sep 17 00:00:00 2001
From: Yifan Zhao <yifanz16@illinois.edu>
Date: Thu, 8 Apr 2021 15:57:55 -0500
Subject: [PATCH] Use 'kept' configs consistently throughout

---
 doc/getting_started.rst | 36 +++++++++++++++++++++++++++---------
 1 file changed, 27 insertions(+), 9 deletions(-)

diff --git a/doc/getting_started.rst b/doc/getting_started.rst
index aefdf71..cc8b7cb 100644
--- a/doc/getting_started.rst
+++ b/doc/getting_started.rst
@@ -125,15 +125,33 @@ We will be using the term QoS throughout the tutorials.
 :py:meth:`tuner.tune <predtuner.modeledapp.ApproxModeledTuner.tune>`
 is the main method for running a tuning session.
 It accepts a few parameters which control the behavior of tuning.
-`max_iter` defines the number of iterations to use in autotuning.
-Within 1000 iterations, PredTuner should find about 200 valid configurations.
-PredTuner will also automatically mark out `Pareto-optimal
-<https://en.wikipedia.org/wiki/Pareto_efficiency>`_
-configurations.
-These are called "best" configurations (`tuner.best_configs`),
-in contrast to "valid" configurations which are the configurations that satisfy our accuracy requirements
-(`tuner.kept_configs`).
-`take_best_n` allows taking some extra close-optimal configurations in addition to Pareto-optimal ones.
+
+* `qos_keep_threshold` sets the QoS threshold: configurations found during tuning are kept only if their QoS is above this threshold.
+  These are called the "kept" configurations and are accessible from `tuner.kept_configs`.
+
+* `max_iter` defines the number of iterations to use in autotuning.
+  Within 1000 iterations, PredTuner should be able to find about 200 "kept" configurations.
+
+* PredTuner will also automatically mark out
+  `Pareto-optimal <https://en.wikipedia.org/wiki/Pareto_efficiency>`_ configurations.
+  These are called "best" configurations (`tuner.best_configs`).
+  `take_best_n` allows taking extra close-optimal configurations in addition to Pareto-optimal ones; see the example below.
 
 1000 iterations is for demonstration; in practice,
 at least 10000 iterations are necessary on VGG16-sized models to converge to a set of good configurations.
-- 
GitLab