* avoid creating attention masks when there is no padding
* make fix-copies
* torch compile tests
* set all-ones mask to None (see the sketch after this list)
* prevent positional encoding from exceeding 4096
* fix from review
* slice freqs_cis to match the input sequence length
* keep only the attention masking change
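
A minimal sketch of the masking and positional changes above, under assumptions: the helper names `prepare_attention_mask` and `slice_freqs_cis` and the tensor layouts are hypothetical, not the library's actual API.

```python
import torch
from typing import Optional

def prepare_attention_mask(attention_mask: Optional[torch.Tensor]) -> Optional[torch.Tensor]:
    # Hypothetical helper: if every position is unmasked (all ones, i.e. no padding),
    # return None so the attention kernel can skip mask handling entirely.
    if attention_mask is not None and torch.all(attention_mask == 1):
        return None
    return attention_mask

def slice_freqs_cis(freqs_cis: torch.Tensor, seq_len: int) -> torch.Tensor:
    # Hypothetical helper: slice the precomputed rotary frequencies to the actual
    # sequence length so position indices never run past the precomputed table
    # (e.g. 4096 positions).
    return freqs_cis[:seq_len]
```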
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>