I am about to buy a laptop and I am discovering that most have integrated video cards (e.g. Intel GMA X3100 with 384 MB, or ATI Radeon X1250 with 512 MB of allocated RAM). I understand that these integrated video cards use the machine's RAM via 'sharing'.
QUESTION: is this an issue for me? I author in Director 10 with a cast full of dozens of scripts, usually one .w3d file (created in LightWave, with somewhere between 5,000 and 20,000 points), and a Flash file that streams in FLVs for animated textures.
Yes, it is. The shared RAM itself isn't so much the issue; more RAM will equate to better handling of the texture maps on those 3D objects.
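To get a feel for why texture memory matters, here is a back-of-the-envelope calculation you can run in the Message window (plain arithmetic, not any special Director API):

```
-- one 1024x1024 texture at 32 bits (4 bytes) per pixel
put 1024 * 1024 * 4        -- 4194304 bytes, i.e. 4 MB before mipmaps
-- a full mipmap chain adds roughly one third on top of that
put (1024 * 1024 * 4) * 4 / 3
```

Even a few large animated textures add up quickly, and that memory comes out of the same shared pool the OS and desktop are already drawing from.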
However, the Intel GMA graphics processor (aka GPU) is not up to par compared to an ATI or NVidia GPU. When it comes to rendering 3D objects you need fast floating-point calculations and polygon rendering/fills, and that is what an ATI or NVidia card in a desktop or laptop will buy you. Both excel here and will render DirectX (the 3D driver) faster and better than the Intel GPU.
The downside is that they consume more power. You will seldom find an ATI or NVidia card built into, say, a Tablet laptop. Most Tablets lean towards consuming less power for better portability in the hands of the user, and thus ship with the less power-hungry (and weaker) Intel GPU.
Also, if you do a lot of transparency in Director (and especially Flash within Director), the Intel GPU will come to a crawl.
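Whatever machine you end up with, you can check from the Message window which renderer Shockwave 3D is actually using, and request a specific one (standard 3D Lingo; the exact renderer symbols listed depend on the machine):

```
-- show the renderer currently in use and the ones this machine offers
put getRendererServices().renderer
put getRendererServices().rendererDeviceList

-- request hardware DirectX for new 3D sprites (falls back if unavailable)
the preferred3DRenderer = #directX7_0
```

If `renderer` comes back as #software on the laptop you are testing, that is a strong sign the integrated GPU is not keeping up.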