Support for FIFOQueue #1099

Open

@dsmilkov

Description

Currently we fail to convert a TF model if the model contains a FIFOQueue op.

We should investigate which models use FIFOQueue, and whether queues make sense at inference time or only during training.

Two paths for graphs with FIFOQueue:

  1. Implement a FIFOQueue and make the model stateful so that executing an "enqueue" or "dequeue" node modifies the state of the graph. This solution is technically feasible, but challenging.

  2. In the meantime, another option is to treat this as a UX problem and provide information to the user at conversion time.
    2A) The simplest solution is to detect that the graph has a "dequeue" op followed by a synchronous subgraph, and warn the user to consider feeding data right after that op (showing the op's name and shape) when calling model.execute().
    2B) Additionally, it would be great to ask the user whether they are OK with the converter dropping all nodes before the "dequeue" op, which would also shrink the graph.
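The detection and pruning idea in 2A/2B could be sketched as a simple scan over the graph's nodes. This is a minimal illustration only, not actual tfjs-converter code: the lightweight `Node` records below stand in for real `NodeDef` protos, and the set of dequeue op names is an assumption based on common TF queue ops.

```python
from collections import namedtuple

# Hypothetical stand-in for a GraphDef NodeDef: op type, node name, input names.
Node = namedtuple("Node", ["op", "name", "inputs"])

# Assumed set of dequeue op types to look for (illustrative, not exhaustive).
DEQUEUE_OPS = {"QueueDequeueV2", "QueueDequeueManyV2", "QueueDequeueUpToV2"}

def find_dequeue_nodes(nodes):
    """Return names of dequeue nodes, so the converter can warn the user
    to feed data at these points when calling model.execute() (option 2A)."""
    return [n.name for n in nodes if n.op in DEQUEUE_OPS]

def nodes_before(nodes, dequeue_names):
    """Transitively collect nodes that only feed the dequeue ops (option 2B):
    candidates for the converter to drop, making the graph smaller."""
    by_name = {n.name: n for n in nodes}
    stack = [i for name in dequeue_names for i in by_name[name].inputs]
    seen = set()
    while stack:
        name = stack.pop()
        if name in seen or name not in by_name:
            continue
        seen.add(name)
        stack.extend(by_name[name].inputs)
    return seen

# Toy graph: an input pipeline feeding a conv layer through a FIFO queue.
graph = [
    Node("Placeholder", "filenames", []),
    Node("FIFOQueueV2", "input_queue", ["filenames"]),
    Node("QueueDequeueV2", "input_queue/dequeue", ["input_queue"]),
    Node("Conv2D", "conv1", ["input_queue/dequeue"]),
]

dequeues = find_dequeue_nodes(graph)
print(dequeues)                               # ['input_queue/dequeue']
print(sorted(nodes_before(graph, dequeues)))  # ['filenames', 'input_queue']
```

Given such a scan, the converter could print the dequeue op's name and shape and suggest feeding it directly, e.g. `model.execute({'input_queue/dequeue': inputTensor})`, while offering to drop the upstream queue nodes.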
