Transformer-based Transform Coding [ICLR 2022] in CompressAI + PyTorch Lightning #249
              
ali-zafari asked this question in Show and tell
Hi everyone!
I have integrated the four models described in the ICLR 2022 paper Transformer-based Transform Coding (Conv-Hyperprior, Conv-ChARM, SwinT-Hyperprior, and SwinT-ChARM) into the CompressAI framework. You can find my implementation here:
github.com/ali-zafari/TBTC.
I tried to follow the code structure of CompressAI so that the models are easily accessible to anyone familiar with this great PyTorch library. A TensorFlow implementation of SwinT-ChARM was used as a reference.
The models are defined in TBTC/compressai/models/qualcomm.py.
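For illustration, here is a minimal usage sketch assuming the models follow the standard CompressAI `forward()` contract (a dict with the reconstruction and the latent likelihoods). The class name `SwinTChARM` and its default constructor are my assumptions, not the repository's actual API:

```python
# Minimal usage sketch; `SwinTChARM` is a hypothetical class name, check
# TBTC/compressai/models/qualcomm.py for the actual identifiers and args.
import torch
from compressai.models.qualcomm import SwinTChARM  # hypothetical import

model = SwinTChARM().eval()

x = torch.rand(1, 3, 256, 256)  # dummy RGB batch in [0, 1]
with torch.no_grad():
    out = model(x)

# CompressAI-style models return the reconstruction and the latent
# likelihoods used to estimate the rate.
x_hat, likelihoods = out["x_hat"], out["likelihoods"]
```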
For training, I wrapped the CompressAI-based models in a PyTorch Lightning module to make multi-GPU training, logging, and checkpointing easier; a sketch of this wrapping is shown below. You can also download a sample checkpoint for each of the models to verify their performance against the results reported in the paper.
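The sketch below is not the repository's actual wrapper, just a minimal example of the idea, assuming the model follows the CompressAI `forward()` contract and using the standard CompressAI rate-distortion training objective (`lmbda * 255^2 * MSE + bpp`):

```python
# Minimal Lightning wrapper sketch (hypothetical class name `LitCompressor`).
import pytorch_lightning as pl
import torch
import torch.nn.functional as F


class LitCompressor(pl.LightningModule):
    def __init__(self, model, lmbda=0.01, lr=1e-4):
        super().__init__()
        self.model = model
        self.lmbda = lmbda  # rate-distortion trade-off weight
        self.lr = lr

    def training_step(self, batch, batch_idx):
        x = batch
        out = self.model(x)  # {"x_hat": ..., "likelihoods": {...}}
        num_pixels = x.size(0) * x.size(2) * x.size(3)
        # Rate: total bits of all latent tensors, normalized to bits per pixel.
        bpp = sum(
            (-torch.log2(l).sum()) / num_pixels
            for l in out["likelihoods"].values()
        )
        # Distortion: MSE, scaled as in the usual CompressAI training recipe.
        mse = F.mse_loss(out["x_hat"], x)
        loss = self.lmbda * 255 ** 2 * mse + bpp
        self.log_dict(
            {"train/loss": loss, "train/bpp": bpp, "train/mse": mse},
            prog_bar=True, sync_dist=True,
        )
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)
```

With such a wrapper, multi-GPU training reduces to configuring the Trainer, e.g. `pl.Trainer(accelerator="gpu", devices=4, strategy="ddp")` with recent PyTorch Lightning versions.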
I hope you find it useful!