OK, so it's like this... I LOVE Vegas, but I'm now working on a lot of short films using HDV, and sometimes DigiBeta and XDCAM HD, so the highest image quality is a must. The thing is, I know Vegas is only an 8-bit program, whereas Final Cut and even Premiere are 10-bit, and I'm really hoping someone who has experience with the different programs can give me their feedback on any noticeable differences between working in an 8-bit and a 10-bit colour range. I'll be doing a lot of colour correcting, so the extra headroom in colour sounds as if I may have to switch from Vegas, which is something I would truly hate - I've been a user since version 3 and know it inside out. In fact, to rant on just a bit more, I can't believe Vegas is still only an 8-bit application. Doesn't Sony realise this may be the only big difference between the major applications? I was hoping they'd sort this for version 7, but oh well, I guess I'll have to hope for version 8.
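To put the 8-bit vs 10-bit point in concrete terms, here's a rough sketch (Python, with an arbitrary example gain of 0.6 - not a value from any real NLE) of why repeated colour corrections lose more tonal levels at 8 bits than at 10:

```python
# Rough sketch of why bit depth matters for colour grading: darken a
# channel to 60% and then brighten it back, quantising at each step.
# The 0.6 gain is an arbitrary example value, not from any real NLE.

def round_trip(levels, gain):
    """Darken by `gain`, re-quantise, undo the gain, re-quantise,
    then count how many of the original levels survive."""
    survivors = set()
    for v in range(levels):
        dark = min(levels - 1, round(v * gain))
        back = min(levels - 1, round(dark / gain))
        survivors.add(back)
    return len(survivors)

def round_trip_10bit(gain):
    """Same 8-bit source, but graded in a 10-bit pipeline and
    delivered back at 8 bits."""
    survivors = set()
    for v in range(256):
        v10 = v * 4                          # promote 8-bit to 10-bit
        dark = min(1023, round(v10 * gain))  # grade at 10-bit precision
        back = min(1023, round(dark / gain))
        survivors.add(round(back / 4))       # deliver back at 8-bit
    return len(survivors)

# Grading entirely at 8 bits collapses levels together (banding):
print(round_trip(256, 0.6))    # → 154 of 256 levels survive
# The 10-bit intermediate preserves every original level:
print(round_trip_10bit(0.6))   # → 256
```

The extra two bits give the intermediate maths finer steps to land on, so the round trip comes back lossless - which is roughly why heavy grading benefits from a 10-bit pipeline even when the source and delivery are 8-bit.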
Also, I'm a big believer in using your eyes rather than going by the scientific numbers, and I've noticed a significant difference when rendering to PAL DV on best settings in Vegas versus PAL DV in Final Cut Pro (a friend did this for me at uni). The Final Cut version looks much 'cleaner' than the Vegas one; I found the same with Premiere, though not to the same extent as with Final Cut. I assume it's down to better codecs, but I also tried rendering an uncompressed AVI version of the project from Vegas myself, and it still doesn't look as crisp as what Final Cut Pro is outputting. Just so you know, it was a single scene of a family BBQ, captured directly into each application via FireWire and rendered straight back out again. Has anyone experienced anything similar?
I just checked out the thread - very exciting news. However, it seems to me that running well on a 64-bit platform will only benefit the program in speed and general performance. Unless I'm mistaken, it will still leave me with my 'issue' of no 10-bit processing.