Unable to load the trained model for inference #5

@sreeramjoopudi

Description

I have been using AllenNLP for the past year and have successfully trained and run inference on models through config files. Recently I wanted to train and load models without config files, and I was able to train a model by using AllenNLP as a library. However, when I tried to load this model in a separate process/Python script for inference, I ran into a missing config.json error: the load_archive method of allennlp.models.archival fails when I point it at the trained model's output directory. Could you tell me:

  • whether this is expected, and whether the way to overcome it is to create a config.json myself (I believe I could write my own config.json if that is what inference requires), or

  • whether there is another way to load the trained model.
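One workaround mentioned above is to write the config.json by hand so that load_archive can find it. The sketch below shows what such a file might look like, using only the standard library. The "type" values and the serialization directory name are hypothetical placeholders, not values from this issue; in practice they would have to match the registered names and constructor arguments of the dataset reader and model you built in code.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical serialization directory (AllenNLP would normally create this
# during training); a temporary directory is used here for illustration.
serialization_dir = Path(tempfile.mkdtemp())

# A minimal, hand-written config. "text_classification_json" and
# "basic_classifier" are example registered names -- substitute the types
# and constructor arguments your own code actually used.
config = {
    "dataset_reader": {"type": "text_classification_json"},  # assumption
    "model": {"type": "basic_classifier"},                   # assumption
}

config_path = serialization_dir / "config.json"
config_path.write_text(json.dumps(config, indent=2))

# Sanity check: the file round-trips as JSON.
loaded = json.loads(config_path.read_text())
```

Note that a config.json alone is likely not sufficient: an AllenNLP serialization directory normally also contains the trained weights and a vocabulary/ subdirectory, so those would need to be present (or produced by your training script) for load_archive to succeed.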
