tensorpack.predict package

class tensorpack.predict.PredictorBase[source]

Bases: object

Base class for all predictors.


return_input

bool – whether the call will also return (inputs, outputs) or just the outputs.


__call__(*args)[source]

Call the predictor on some inputs.

If len(args) == 1, assume args[0] is a datapoint (a list). Otherwise, assume args is a datapoint.


When you have a predictor which takes a datapoint [e1, e2], you can call it in two ways:

predictor(e1, e2)
predictor([e1, e2])
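The two calling conventions above can be sketched as the following dispatch logic. This is a minimal stand-alone sketch, not the real class; `_do_call` is a hypothetical stand-in for the predictor's actual session run:

```python
class PredictorSketch:
    """Sketch of PredictorBase.__call__'s argument handling (not the real class)."""

    def _do_call(self, dp):
        # Hypothetical stand-in: the real predictor would run the session here.
        return [sum(dp)]

    def __call__(self, *args):
        if len(args) == 1:
            # One argument: assume it is the whole datapoint (a list).
            dp = args[0]
        else:
            # Several arguments: assume they together form the datapoint.
            dp = list(args)
        return self._do_call(dp)

predictor = PredictorSketch()
# Both conventions feed the same datapoint [1, 2]:
same = predictor(1, 2) == predictor([1, 2])
```

Either form is fine; the single-list form is convenient when the datapoint already comes from a DataFlow.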
class tensorpack.predict.AsyncPredictorBase[source]

Bases: tensorpack.predict.base.PredictorBase

Base class for all async predictors.

put_task(dp, callback=None)[source]
  • dp (list) – A datapoint as inputs. It could be either batched or not batched, depending on the predictor implementation.

  • callback – a thread-safe callback to get called with either outputs or (inputs, outputs).


Returns:concurrent.futures.Future – a Future of results.

start()[source]

Start the workers.
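The put_task contract can be sketched with a plain worker thread and concurrent.futures.Future. This is not tensorpack's actual implementation; the toy "predictor" just doubles its inputs:

```python
import queue
import threading
from concurrent.futures import Future

class ToyAsyncPredictor:
    """Sketch of the AsyncPredictorBase contract: put_task returns a Future."""

    def __init__(self):
        self._queue = queue.Queue()

    def start(self):
        # Start one worker thread that serves tasks from the queue.
        threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            dp, fut, callback = self._queue.get()
            outputs = [x * 2 for x in dp]   # toy "prediction"
            fut.set_result(outputs)
            if callback is not None:
                callback(outputs)

    def put_task(self, dp, callback=None):
        fut = Future()
        self._queue.put((dp, fut, callback))
        return fut

pred = ToyAsyncPredictor()
pred.start()
future = pred.put_task([1, 2, 3])
print(future.result())  # → [2, 4, 6]
```

The caller can either block on the Future or pass a thread-safe callback, as described above.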

class tensorpack.predict.OnlinePredictor(input_tensors, output_tensors, return_input=False, sess=None)[source]

Bases: tensorpack.predict.base.PredictorBase

A predictor which directly uses an existing session and the given tensors.

__init__(input_tensors, output_tensors, return_input=False, sess=None)[source]
  • input_tensors (list) – list of names.

  • output_tensors (list) – list of names.

  • return_input (bool) – same as PredictorBase.return_input.

  • sess (tf.Session) – the session this predictor runs in. If None, will use the default session at the first call.

class tensorpack.predict.OfflinePredictor(config)[source]

Bases: tensorpack.predict.base.OnlinePredictor

A predictor built from a given config. A single-tower model will be built without any prefix.

Parameters:config (PredictConfig) – the config to use.
class tensorpack.predict.MultiProcessPredictWorker(idx, config)[source]

Bases: multiprocessing.process.Process

Base class for predict workers that run offline in separate processes.

__init__(idx, config)[source]
  • idx (int) – index of the worker. The 0th worker will print logs.

  • config (PredictConfig) – the config to use.

class tensorpack.predict.MultiProcessQueuePredictWorker(idx, inqueue, outqueue, config)[source]

Bases: tensorpack.predict.concurrency.MultiProcessPredictWorker

An offline predictor worker that takes input from a queue and produces output to another queue. Each worker process exits when it receives DIE.

__init__(idx, inqueue, outqueue, config)[source]
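The queue protocol can be sketched as follows. In this stand-alone sketch a thread stands in for the worker process, DIE is a plain sentinel string, and the toy "prediction" replaces the OfflinePredictor the real worker would run:

```python
import queue
import threading

DIE = "DIE"  # sentinel: tells a worker to exit

def queue_predict_worker(inqueue, outqueue):
    """Sketch of MultiProcessQueuePredictWorker's loop: consume (id, datapoint)
    tasks until the DIE sentinel arrives."""
    while True:
        task = inqueue.get()
        if task == DIE:
            outqueue.put(DIE)   # propagate the sentinel, then exit
            return
        tid, dp = task
        outputs = [x + 1 for x in dp]   # toy "prediction"
        outqueue.put((tid, outputs))

inq, outq = queue.Queue(), queue.Queue()
w = threading.Thread(target=queue_predict_worker, args=(inq, outq))
w.start()
inq.put((0, [1, 2]))
inq.put(DIE)
w.join()
first = outq.get()
print(first)  # → (0, [2, 3])
```

Tagging each task with an id lets the consumer match outputs back to inputs regardless of which worker handled them.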
class tensorpack.predict.MultiThreadAsyncPredictor(predictors, batch_size=5)[source]

Bases: tensorpack.predict.base.AsyncPredictorBase

A multithreaded online async predictor which runs a list of OnlinePredictor instances. It performs extra batching internally.

__init__(predictors, batch_size=5)[source]
  • predictors (list) – a list of OnlinePredictor available to use.

  • batch_size (int) – the maximum size of an internal batch.

put_task(dp, callback=None)[source]

Same as in AsyncPredictorBase.put_task().
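The internal batching can be sketched like this: block for one pending datapoint, then greedily drain more from the queue up to batch_size before running the predictor once on the whole batch. A stand-alone sketch, not tensorpack's code:

```python
import queue

def fetch_batch(q, batch_size):
    """Sketch of the batching step: block for one task, then greedily take
    more pending tasks, up to batch_size in total."""
    batch = [q.get()]
    while len(batch) < batch_size:
        try:
            batch.append(q.get_nowait())
        except queue.Empty:
            break   # no more pending tasks; run with what we have
    return batch

q = queue.Queue()
for dp in ([1], [2], [3]):
    q.put(dp)
batch = fetch_batch(q, batch_size=5)
print(batch)  # → [[1], [2], [3]]
```

Batching amortizes the per-call session overhead when many small requests arrive concurrently, which is why batch_size is a maximum rather than a fixed size.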

class tensorpack.predict.PredictConfig(model, session_creator=None, session_init=None, input_names=None, output_names=None, return_input=False, create_graph=True, session_config=None)[source]

Bases: object

__init__(model, session_creator=None, session_init=None, input_names=None, output_names=None, return_input=False, create_graph=True, session_config=None)[source]
  • model (ModelDesc) – the model to use.

  • session_creator (tf.train.SessionCreator) – how to create the session. Defaults to sesscreate.NewSessionCreator().

  • session_init (SessionInit) – how to initialize variables of the session. Defaults to do nothing.

  • input_names (list) – a list of input tensor names. Defaults to all inputs of the model.

  • output_names (list) – a list of names of the output tensors to predict. The tensors can be any computable tensors in the graph.

  • return_input (bool) – same as in PredictorBase.return_input.

  • create_graph (bool) – create a new graph, or use the default graph, when the predictor is first initialized.

class tensorpack.predict.DatasetPredictorBase(config, dataset)[source]

Bases: object

Base class for dataset predictors. These are predictors which run over a DataFlow.

__init__(config, dataset)[source]
get_all_result()[source]
Returns:list – all outputs for all datapoints in the DataFlow.
get_result()[source]
Yields:output for each datapoint in the DataFlow.
class tensorpack.predict.SimpleDatasetPredictor(config, dataset)[source]

Bases: tensorpack.predict.dataset.DatasetPredictorBase

Simply create one predictor and run it on the DataFlow.

class tensorpack.predict.MultiProcessDatasetPredictor(config, dataset, nr_proc, use_gpu=True, ordered=True)[source]

Bases: tensorpack.predict.dataset.DatasetPredictorBase

Run prediction in multiple processes, on either CPU or GPU. Each process fetches datapoints as tasks and runs predictions independently.

__init__(config, dataset, nr_proc, use_gpu=True, ordered=True)[source]
  • config – same as in DatasetPredictorBase.

  • dataset – same as in DatasetPredictorBase.

  • nr_proc (int) – number of processes to use.

  • use_gpu (bool) – use GPU or CPU. If GPU, then nr_proc cannot be more than what’s in CUDA_VISIBLE_DEVICES.

  • ordered (bool) – produce outputs in the original order of the datapoints. This will be a bit slower. Otherwise, get_result() will produce outputs in any order.
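With ordered=True, outputs that arrive out of order must be buffered until their turn comes, which is where the slowdown comes from. A minimal sketch of that reordering (not tensorpack's implementation), assuming each output is tagged with its datapoint index:

```python
def reorder(tagged_results):
    """Yield outputs in original datapoint order, given (index, output)
    pairs that may arrive in any order."""
    buffer = {}
    next_idx = 0
    for idx, out in tagged_results:
        buffer[idx] = out
        # Flush every output whose turn has come.
        while next_idx in buffer:
            yield buffer.pop(next_idx)
            next_idx += 1

# Workers finished datapoints 2 and 0 before 1:
arrivals = [(2, "c"), (0, "a"), (1, "b")]
print(list(reorder(arrivals)))  # → ['a', 'b', 'c']
```

With ordered=False no buffering is needed and results can be consumed as soon as any worker produces them.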

class tensorpack.predict.MultiTowerOfflinePredictor(config, towers)[source]

Bases: tensorpack.predict.base.OnlinePredictor

A multi-tower multi-GPU predictor.

__init__(config, towers)[source]
  • config (PredictConfig) – the config to use.

  • towers – a list of relative GPU ids.

get_predictor(n)[source]
Returns:OnlinePredictor – the nth predictor on the nth tower.
get_predictors()[source]
Returns:list[OnlinePredictor] – a list of predictors.
class tensorpack.predict.DataParallelOfflinePredictor(config, towers)[source]

Bases: tensorpack.predict.base.OnlinePredictor

A data-parallel predictor. Note that it doesn’t split/concat inputs/outputs automatically. Instead, its inputs are: [input[0] in tower[0], input[1] in tower[0], ..., input[0] in tower[1], input[1] in tower[1], ...] Similar for the outputs.

__init__(config, towers)[source]
  • config (PredictConfig) – the config to use.

  • towers – a list of relative GPU ids.
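The input layout described above is a plain flattening: given one datapoint per tower, the combined input list concatenates each tower's inputs in tower order. A sketch of the layout only, not of the predictor itself (the names are made up for illustration):

```python
def data_parallel_inputs(per_tower_datapoints):
    """Flatten per-tower datapoints into the single input list that the
    data-parallel layout expects:
    [input[0] of tower[0], input[1] of tower[0], input[0] of tower[1], ...]"""
    flat = []
    for dp in per_tower_datapoints:
        flat.extend(dp)
    return flat

# Two towers, each taking a datapoint of two inputs (hypothetical names):
tower0 = ["img_a", "label_a"]
tower1 = ["img_b", "label_b"]
combined = data_parallel_inputs([tower0, tower1])
print(combined)  # → ['img_a', 'label_a', 'img_b', 'label_b']
```

The caller is responsible for this splitting, and symmetrically for regrouping the flat output list per tower, since the predictor does not split/concat automatically.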