I'm not sure why Adobe didn't include an option to burn your Blu-ray disc files to a folder on your hard drive, as it did for DVD files. It would sure be helpful, since the situation you describe is common: multiple burns of the same project sometimes do result in corruption.
Maybe someone can offer a solution with the program. But I know many people use a great little program called DVD Architect Studio 5 to supplement Premiere Elements. The program, available as a $39 download from Sony, is a dedicated disc-authoring program -- so you edit in Premiere Elements and then hand the file off to DVD Architect.
Among the many great benefits of this program is that it allows you to output a disc image of your Blu-ray, from which you can relatively quickly output as many copies as you'd like without re-transcoding the whole project every time.
Another method is to buy a 'virtual burner' that makes the software think it is seeing a physical drive, when in fact the output is redirected to your hard disk.
This has been a feature request that I have filed for several versions. It seems it would be fairly easy, since the authoring function in PrE is based on the same Sonic AuthorCore modules as Adobe Encore (which ships with PrPro), and Encore has Burn to Folder for both DVD and BD, plus Burn Image (ISO file). One can then use the free ImgBurn to burn multiple BDs (or DVDs) on multiple burners on the computer (by opening multiple instances of ImgBurn).
Oh well, it must not be as easy as I imagine. Maybe Adobe has to license every module from Sonic, and decided not to license this one?
As PE9 cannot save an AVCHD file to the hard disk so that it can be used by Sony's DVD Architect, how does one use that Sony program to make a Blu-ray disc if it doesn't have a file to use?
It seems to do a good job saving 20 minutes of an AVCHD project to a DVD.
To output video from Premiere Elements so that DVD Architect can use it to create a Blu-ray, use Share/Computer/MPEG and select the preset for 1440x1080 30i. (Those are non-square TV pixels, so it's the same resolution as 1920x1080.) Your output will be an m2t file.
This video format will require virtually no transcoding when DVD Architect converts it to BluRay files. (DVDs require a different source format, of course.)
More details are in my books, if you're interested.
I am curious as to how the final result can be 1920 when you select a 1440 preset. How does the non-square TV pixel thing affect this?
Non-square pixels are wider than they are tall, Ted. 33% wider than tall, to be precise.
So 1440 non-square pixels are exactly as wide as 1920 square pixels, with no discernible difference in the number of pixels or resolution.
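A quick check of that arithmetic (a minimal sketch; the 4:3 pixel aspect ratio is the standard anamorphic HD value, not a figure stated in the post above):

```python
# Anamorphic HD stores 1440 pixels per row, each displayed 4/3 as wide
# as it is tall, so the row spans the same width as 1920 square pixels.
stored_width = 1440
display_width = stored_width * 4 // 3  # apply the 4:3 pixel aspect ratio
print(display_width)  # 1920
```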
But with a 1920 pixel screen, the maximum horizontal resolution of a 1920 picture is surely 1920 / 2 = 960 lines. This is because the finest detail you can display has to have alternate black and white dots, according to the definition of resolution.
I would have thought that the maximum horizontal resolution of a 1440 picture is 1440 / 2 = 720 lines.
How come the horizontal resolution of the final Blu-ray is not correspondingly lower (and the picture softer) when 1440 is used?
You've got your horizontals and verticals mixed up, Ted.
Hi-def video is 1440 lines.
The 1920 and 1440 are the vertical rows of dots from side to side of the screen. There is no discernible difference between 1920 pixels across and 1440 non-square pixels across.
Thanks for the response. I will look into Architect Studio. Meanwhile, I restarted my computer and was able to burn three copies of my project to Blu-ray without issues.
>"There is no discernible difference between 1920 pixels across and 1440 non-square pixels across."
I don't think so-
All my Hi-def video is 1920 x 1080i.
My computer monitor is 1290 x 1200
All the new TV sets on sale over 24" (here in Australia) are 1920 x 1080.
Hi-def TV transmissions are 1920x1080i 25f (except our poor old Government run ABC that is ruled by ex POMs)
The only 1440 we get is in some lesser-quality movie theatres and many TV programs from the UK, or old stuff from the USA.
It is true that a picture can look better on an LCD screen whose pixel count matches the original video. To see the effect, set your computer's video card to a lower resolution than the monitor's native screen resolution and try to read fine text.
If you observe 1440 on a 1920 monitor, it looks the equivalent of about 900, because you can never match up the pixels to get a sharp transition, and you get finely jagged diagonal lines if you don't restrict the resolution.
If you have been evaluating the difference between 1440 and 1920 and not seen a noticeable difference, I would suspect that your computer monitor's native resolution must be 1440 and not 1920 (as a lot of supposedly hi-def computer monitors are).
The difference regarding square or rectangular pixels is to do with the aspect ratio. 1440 rectangular pixels take up the same width as 1920 square pixels.
To preserve the 16x9 aspect, a TV set with a 1440 screen would have to have wider holes, or else a wider guard band between pixels (so less brightness).
You can't see any horizontal resolution greater than 1440 on a 1440 monitor.
As the two settings in question are both 1080 vertically, the vertical resolution is the same. However, if the resultant file is limited to 1440 pixels horizontally, then it must have a lower horizontal resolution than 1920.
You can plainly see the difference on a 1920x1200 computer monitor and a Sony Bravia 40" LCD TV.
>"You've got your horizontals and verticals mixed up, Ted"
I am afraid not -
Horizontal resolution is the number of black AND white dots that can be resolved across the screen.
Vertical resolution is the number of black AND white dots that can be discerned from top to bottom.
Pixels have nothing directly to do with the definition of resolution. The pixel count describes the number of holes in a screen, or the size of the discrete steps in the signal.
The more pixels, the higher the resolution.
An old NTSC 525-line TV set was said to have a vertical resolution of 252 and a half lines, that being the greatest number of black and white horizontal lines that can be displayed in a vertical direction, allowing for 20 lines of vertical blanking, with the half line caused by interlace.
By the same token, the horizontal resolution was dependent on the bandwidth of the video amplifier below the color subcarrier of 3.58 MHz, giving a maximum of about 350 lines of horizontal resolution, although many TVs were much lower.
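That 350-line figure can be sanity-checked with the usual rule of thumb: each cycle of video bandwidth can paint one black and one white dot. The 52.7 microsecond active line time and the roughly 3.3 MHz effective luma bandwidth (kept below the 3.58 MHz subcarrier) are assumed values, not figures from the post above:

```python
# Rule of thumb: resolvable dots = 2 * bandwidth * active line time,
# since each signal cycle can render one black and one white dot.
active_line_time_s = 52.7e-6  # visible portion of an NTSC scan line (assumed)
bandwidth_hz = 3.3e6          # assumed luma bandwidth, below the 3.58 MHz subcarrier

lines = 2 * bandwidth_hz * active_line_time_s
print(round(lines))  # 348, in the ballpark of the 350 quoted above
```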
Sorry typo error above -
"My computer monitor is 1290 x 1200" should have read "My computer monitor is 1920 x 1200"
Maybe it was dust on my monitor, or dirty glasses, or maybe "Wine-thirty" was declared too early, but I read your original as 1920 x 1200, as that was what I was expecting...