Fix broken imports in 4 existing contrib models #114
Open
jimburtoft wants to merge 1 commit into aws-neuron:main from
Conversation
- Mixtral-8x7B: `__init__.py` imported from `.mixtral_model`, but the file is `modeling_mixtral.py`
- OLMo-2-1124-7B: `__init__.py` imported from `neuronx_port.modeling_olmo2` (nonexistent package)
- Qwen3-VL-8B-Thinking: `__init__.py` imported from `neuronx_port.modeling_qwen3_vl` (nonexistent package)
- biogpt: test imported `NeuronBioGPTForCausalLM`/`BioGPTInferenceConfig` (wrong case); the actual classes use title case (`BioGpt`)

All four models crash on import before this fix. The changes are mechanical: wrong module names and case mismatches only.
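The biogpt failure mode can be reproduced in isolation: Python attribute lookup is case-sensitive, so the wrong-case name simply does not exist on the module. The stub below is a hypothetical stand-in for the real modeling file, used only to illustrate the mechanism:

```python
import types

# Hypothetical stub standing in for the real biogpt modeling module.
mod = types.ModuleType("modeling_biogpt")
mod.NeuronBioGptForCausalLM = type("NeuronBioGptForCausalLM", (), {})

# The title-case name (the actual spelling) resolves; the all-caps variant does not.
assert hasattr(mod, "NeuronBioGptForCausalLM")
assert not hasattr(mod, "NeuronBioGPTForCausalLM")  # wrong case, as in the old test
print("case-sensitivity check passed")
```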
Contributor
Author
Validation Results

Tested on a trn2.3xlarge spot instance (sa-east-1, SDK 2.28, LNC=2).

Import Validation (all 4 models)

All four models import successfully after the fix; the classes are found and instantiated correctly.
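The import check can be sketched as a small smoke test. Module names below mirror the broken path described in this PR; `import_ok` is a hypothetical helper, not part of the repo:

```python
import importlib

def import_ok(module_name: str) -> bool:
    """Return True if the module imports cleanly, False on an import failure."""
    try:
        importlib.import_module(module_name)
        return True
    except (ModuleNotFoundError, ImportError):
        return False

# The pre-fix path fails because the `neuronx_port` package never existed:
assert not import_ok("neuronx_port.modeling_olmo2")
# Sanity check that the helper passes on a module that does exist:
assert import_ok("json")
print("import smoke test passed")
```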
Compilation Validation

I attempted end-to-end compile+generate on biogpt and Qwen3-VL-8B. Both hit pre-existing issues unrelated to the import fixes.
These contribs have deeper issues beyond the import paths. The import fixes in this PR are correct and verified: they resolve the import errors.
Do you have the GitHub issue link for the pre-existing bugs?
jaharsh-aws approved these changes on Apr 13, 2026
yidezou approved these changes on Apr 13, 2026
Description
These are fixes to existing contrib models, not new contributions.
Four contrib models crash on import due to wrong module names or case mismatches in their `__init__.py` or test files. All fixes are mechanical; no logic changes.

Fixes
| Model | File | Broken import | Fix |
| --- | --- | --- | --- |
| Mixtral-8x7B-Instruct-v0.1 | `src/__init__.py` | `.mixtral_model` (file is `modeling_mixtral.py`) | `from .modeling_mixtral import ...` |
| OLMo-2-1124-7B | `src/__init__.py` | `neuronx_port.modeling_olmo2` (`neuronx_port` package doesn't exist) | `from .modeling_olmo2 import ...` |
| Qwen3-VL-8B-Thinking | `src/__init__.py` | `neuronx_port.modeling_qwen3_vl` (same issue) | `from .modeling_qwen3_vl import ...` |
| biogpt | `test/integration/test_model.py` | `NeuronBioGPTForCausalLM`/`BioGPTInferenceConfig` (wrong case) | `NeuronBioGptForCausalLM`/`BioGptInferenceConfig` |

Not included
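Concretely, the Mixtral fix is a one-line change to `src/__init__.py` (a sketch; the imported names are elided here as above):

```diff
-from .mixtral_model import ...
+from .modeling_mixtral import ...
```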
helium-1-2b also has broken imports (`helium_config` and `helium_model` modules don't exist), but this is a deeper issue: the `HeliumInferenceConfig` class is referenced throughout `modeling_helium.py` but never defined anywhere. That model needs its config class written, not just an import-path fix.

Files Changed
- `contrib/models/Mixtral-8x7B-Instruct-v0.1/src/__init__.py`
- `contrib/models/OLMo-2-1124-7B/src/__init__.py`
- `contrib/models/Qwen3-VL-8B-Thinking/src/__init__.py`
- `contrib/models/biogpt/test/integration/test_model.py`

Testing
These are import-path fixes only; no model logic is changed. Each fix corrects a `ModuleNotFoundError` or `ImportError` that prevents the contrib from being used at all.
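The two error types map to the two failure modes: a wrong module path raises `ModuleNotFoundError`, while a wrong class name in an existing module raises `ImportError`. A sketch with a hypothetical stub in place of the real contrib module:

```python
import sys
import types

# Hypothetical stub with only the correctly-cased class, standing in for biogpt.
stub = types.ModuleType("modeling_biogpt")
stub.NeuronBioGptForCausalLM = type("NeuronBioGptForCausalLM", (), {})
sys.modules["modeling_biogpt"] = stub

# Wrong module path -> ModuleNotFoundError (Mixtral/OLMo/Qwen3-VL failure mode).
try:
    import neuronx_port.modeling_olmo2  # noqa: F401
except ModuleNotFoundError as e:
    path_error = type(e).__name__

# Wrong class case in an existing module -> ImportError (biogpt failure mode).
try:
    from modeling_biogpt import NeuronBioGPTForCausalLM  # noqa: F401
except ImportError as e:
    case_error = type(e).__name__

print(path_error, case_error)
```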