
Need a FAST machine spec for VW



OK, I've got a good machine, but it keeps running out of memory, or so VW tells me. My PC is an Intel Core 2 at 2.66GHz with 3GB RAM and an ATI Radeon X1300 256MB video card. Does VW have the capability of working on an x64 system so I can increase the RAM, OR are there other specs I need to increase, OR (PLEASE NO!!) do I have to get a Mac? Any opinions would be welcome, thanks. I am using VW 2008 SP3 with Renderworks on a Windows XP Pro SP3 OS. I'd love to know if anyone is using an x64 system.

Is there a way to control how much RAM or virtual memory VW uses? I have plenty of memory available, even when it claims to run out. Typically when VW gives me the memory errors, it is only using about a third of my actual RAM, and usually none of my virtual memory. I have been using CachemanXP to monitor it.


What are your page file settings? Making a page file too large can be detrimental.

If you do not already know this, on your system I would recommend setting the page file max ***and*** min settings to 4.5GB (1.5x physical RAM). This does not mean that VW will use all of it: once RAM plus page file exceeds the maximum process address space, you still have the same amount of memory (RAM and virtual) available to VW irrespective of RAM and pagefile size.

As you have not mentioned that you are running out of virtual memory, I assume that you are running out of address space available to the process. Increasing the pagefile will not give more memory to a single process, assuming that the pagefile is already set correctly.

If this is the case, you need to use an OS that allows more memory per process, i.e. 64-bit Windows.

Otherwise, let Windows manage what goes into RAM and VM itself; if you use the rule-of-thumb settings above, it's going to give pretty balanced performance.


IanH

Currently I am at 2x my RAM, so 6GB of virtual memory. I'd be happy to try 1.5x instead and see if it helps any...

brudgers

I use a program called Diskeeper that actually defrags daily.

MCH

I have read on these forums that some Mac users are having the "lack of memory" errors as well...

Russel

Typically Outlook and Acrobat, sometimes AutoCAD or Photoshop... but I've tried shutting them all down and it doesn't seem to make a difference...



That's exactly right, it shouldn't make a difference. It does indeed sound like you are running out of process address space rather than virtual memory.

I come from a (DEC/Compaq/HP) VMS/VAX/Alpha background. VMS was largely developed by Dave Cutler et al, who then went on to develop NT at Microsoft, which XP and Vista are based on. Much of the elegance of low-level VMS ended up in NT, including the 32-bit virtual memory management system.

I broadly assume that virtual memory management in 32-bit NT/XP/Vista works as per VAX/VMS. Also, when I worked with VMS we moved to 64-bit Alpha before hitting the address space limits of a 32-bit architecture, so the only times I have seen a process run out of memory are in a 32-bit Windows environment, or when our code had a bug in it ;) Sorry for the long post, just reminiscing.

Basically, 32-bit addressing allows up to 4GB of memory to be addressed. On VAX/VMS, this was split into process (P0 and P1) and system (S0) address space, each 2GB in size. S0 was further subdivided and the top half reserved, which may possibly be the root of the 'perceived' 3GB RAM limits.
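To see the equivalent split on 32-bit Windows, here is a minimal C sketch (my own illustration, assuming a standard 32-bit Windows build without the /3GB boot switch) that asks the OS for its page size and for the range of addresses a user process is allowed to use; the maximum application address comes back just under 2GB, mirroring the process/system division described above:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);   /* fills in page size and user address range */

    printf("Pointer size     : %u bits\n", (unsigned)(sizeof(void *) * 8));
    printf("Page size        : %lu bytes\n", (unsigned long)si.dwPageSize);
    printf("Min user address : %p\n", si.lpMinimumApplicationAddress);
    printf("Max user address : %p\n", si.lpMaximumApplicationAddress);

    /* On 32-bit Windows the max user address is around 0x7FFEFFFF, i.e.
       the process gets roughly 2GB and the kernel keeps the upper 2GB. */
    return 0;
}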

What is important is that all this memory is virtual. In other words, in the (then) likely scenario that your machine did not max out the RAM, it did not matter, as the storage was held on disc until it was needed. When an area of memory was needed, it was paged in, in 512-byte blocks, somewhere into RAM where it was physically worked on. It did not matter where in RAM it was stored; the memory management hardware handled the virtual-to-physical address mapping.

A very complex set of algorithms (including predictive read-ahead) then decided when these 512-byte blocks were paged out of RAM. This would then free up the RAM for use by another process. But it was possible that a block remained in RAM because VMS deemed that it would likely be used soon (e.g. a regular system process), or because physical RAM was large enough that all active processes could simultaneously reside in RAM and there was little need to page anything out. There were also writeback algorithms that decided when to write the contents of RAM back to disc, optimised for consecutive blocks. All very complex.

Another important fact was that process space (P0 and P1) was process specific, so another process (thread/application etc.) would have its own process space, and the fact that two processes referenced memory at the same address did not matter, as the contents of that address were specific to each process. If processes needed to share memory and communicate, it had to be done via inter-process communication and shared global memory which, whilst still virtual, was shared between processes. This made NT much more robust than Win95 etc. as one process was unlikely to corrupt another.

The long and short of this is that whilst all processes see the same range of addresses, with a few exceptions they each have exclusive use of it. So you could have two memory-hungry apps, say Vectorworks and Photoshop, both wanting all the memory available to them - i.e. 2GB each. They could both have this so long as there was enough virtual memory to hold both, determined by the page file, which is shared by all apps. Just swapping between apps did not mean that all the memory had to be paged between RAM and disc, as it was done on an as-needed basis. So you might load up a nice large image in Photoshop, say 50MB, but this is small in real terms and only the bits actively being used would be in RAM, not all the bits of Photoshop that no one often uses. The large model in Vectorworks would likely remain in RAM.

The real issue is when Vectorworks wants to use more than its allowable process space provides, in which case it will run out of memory, irrespective of the size of the page file.
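As an illustration of that point only (this is not how Vectorworks itself manages memory), a small C program like the sketch below, compiled as a 32-bit executable, will typically stop somewhere short of 2GB of successful allocations no matter how large the page file is, because it is the process address space rather than the disc that runs out first:

#include <stdio.h>
#include <stdlib.h>

#define CHUNK (64 * 1024 * 1024)   /* allocate in 64MB chunks */

int main(void)
{
    size_t total = 0;
    void *p;

    /* Keep allocating until the C runtime can no longer find address
       space for another chunk.  In a 32-bit process this fails around
       the 2GB mark even with a huge page file. */
    while ((p = malloc(CHUNK)) != NULL) {
        total += CHUNK;
    }

    printf("Ran out of address space after ~%lu MB\n",
           (unsigned long)(total / (1024 * 1024)));
    return 0;
}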

And I believe this is what has happened here. The solution is optimising the model, eliminating any memory leaks or unnecessary memory use by the app, and then looking at an OS that makes a larger process space available to the application. However, just because an app can run in a larger address space does not mean that it can make full use of it, as that would require it to internally use 64-bit data structures, which in a 32-bit environment may be less than optimal due to the additional space needed to store 64-bit words.

As I say, the above is all based on how VAX/VMS worked, but it was the forerunner of the internals of Windows NT. And apologies for getting carried away, just a bit of reminiscence.


You can reduce page file fragmentation by setting the min and max page file sizes to be the same. This way, once you have a contiguous page file, it will never grow and thus never become fragmented.

Even better, create a partition just for the page file and avoid having it on the system disc. Better still, put the page file on its own disc. Those old 9GB SCSI discs do have a use, though the performance of a newer disc may be better even if it is being used by other apps.

However, the underlying issue appears to be running out of memory available to the process. Running out of VM gives another error.


The memory leak is very real on the Mac side. I am unaware of the details on the Windows side of the world.

VW will not correctly release memory when it finishes a rendering. When it gets to about 1.5GB to 2.0GB (it varies a lot), VW will either warn you that you are out of memory, mess up your rendering (leading you to save and restart), or just crash or hang.

RonMan


RonMan

That appears to be what VW is doing on my PC as well. I can watch its memory usage, and after every hidden line render the memory usage just grows and grows. It does seem to "run out" at about 1.5GB, and I've seen that 1.5GB figure in some other posts as well - WHICH leads to my next question...

EVEN IF I have a 64-bit machine with LOTS of RAM, is VW capable of using any more than 1.5GB, or is that a limit in the program?
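For anyone who wants to watch this numerically rather than eyeballing Task Manager, here is a minimal C sketch (my own illustration using the standard psapi.h calls; "memwatch" and the 10-second interval are just placeholders, and the PID is whatever Task Manager shows for the VW process) that polls a process and prints its working set and pagefile usage, so any growth after each hidden line render becomes obvious:

#include <stdio.h>
#include <stdlib.h>
#include <windows.h>
#include <psapi.h>   /* link with psapi.lib */

int main(int argc, char *argv[])
{
    DWORD pid;
    HANDLE h;
    PROCESS_MEMORY_COUNTERS pmc;

    if (argc < 2) {
        printf("usage: memwatch <pid of VW process>\n");
        return 1;
    }
    pid = (DWORD)atoi(argv[1]);

    h = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid);
    if (h == NULL) {
        printf("could not open process %lu\n", (unsigned long)pid);
        return 1;
    }

    /* Print working set (RAM actually in use) and pagefile usage
       (committed virtual memory) every 10 seconds. */
    for (;;) {
        if (!GetProcessMemoryInfo(h, &pmc, sizeof(pmc)))
            break;
        printf("working set: %lu MB   pagefile usage: %lu MB\n",
               (unsigned long)(pmc.WorkingSetSize / (1024 * 1024)),
               (unsigned long)(pmc.PagefileUsage / (1024 * 1024)));
        Sleep(10000);
    }

    CloseHandle(h);
    return 0;
}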


System RAM vs MMIO: the "where is my 4GB?" problem

If you're feeling cramped within 4GB, hold on tight because things are about to get worse. The first issue is a historical limitation in the Intel x86 architecture, related to "Memory Mapped I/O." Essentially, all of the device memory used by video cards or any other expansion cards is mapped on top of the 4GB of addresses used by the system's RAM. This didn't use to be a problem before anyone wanted to actually use the entire 4GB address space for system RAM.

Note that this doesn't mean that MMIO "eats up" your RAM; it's just that the hardware maps that device-related memory over the top of physical memory, leaving fewer addresses available for the operating system to use for its system RAM. This problem is tied to 32-bit chipsets, which are independent of the CPU: there are 64-bit PCs with 32-bit chipsets. For example, Apple's 64-bit Core 2 Duo laptops prior to the second half of 2007 all used 32-bit addressing.

That means that while they can execute 64-bit code and handle 64-bit virtual memory, they still can't address more than 4GB of physical RAM, minus roughly 0.75GB of MMIO, for a grand total of 3.2GB of usable RAM. If you install a full 4GB, the portion in conflict with the MMIO will simply not be used. For PC users installing a high-end video card with 1GB of VRAM, the additional MMIO becomes an even greater problem: their usable system RAM shrinks down to around 2.3GB.
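Just to make the arithmetic explicit, this tiny C sketch (my own illustration, using the rough 0.75GB MMIO figure from the article and a hypothetical 1GB video card) shows where the 3.2GB and 2.3GB numbers come from on hardware limited to 32-bit physical addressing:

#include <stdio.h>

int main(void)
{
    double address_space_gb = 4.0;   /* 32-bit physical address limit    */
    double other_mmio_gb    = 0.75;  /* rough MMIO figure quoted above   */
    double vram_gb          = 1.0;   /* hypothetical high-end video card */

    /* Usable RAM is whatever is left of the 4GB address space after the
       device memory has been mapped over the top of it.  The article
       rounds these results to 3.2GB and roughly 2.3GB. */
    printf("No big video card : %.2f GB usable\n",
           address_space_gb - other_mmio_gb);
    printf("With 1GB of VRAM  : %.2f GB usable\n",
           address_space_gb - other_mmio_gb - vram_gb);
    return 0;
}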

The great PC RAM swindle

With Intel's "Santa Rosa" platform, Apple's Core 2 Duo machines gained chipset support to internally handle 8GB of address space. This allows Santa Rosa Macs to shove MMIO up into the high end of the space and reclaim all of the addresses below the 4GB mark, making the full amount available to the system. No version of 32-bit Windows supports this, and conversely, there is no 32-bit version of Mac OS X Leopard, so the "where is my full 4GB?" issue is now a Windows-only problem going forward.

Prior to the Santa Rosa platform, Apple sold its laptops as supporting a maximum of only 3GB of RAM because of this. Many Original Equipment Manufacturer (OEM) PC assemblers, however, represent their machines as supporting 4GB of RAM even though the operating system can't actually make use of a big chunk of it. With hardware that only supports 32-bit addressing, no operating system can make use of the full 4GB. And even with Santa Rosa-style hardware that can make use of the full 4GB, mainstream 32-bit Windows Vista still won't use more than about 3.2GB because it can't remap MMIO.

One developer we consulted about the issue noted, "consumers are being scammed by [PC] OEMs on a large scale. OEMs will encourage customers to upgrade a 2GB machine to 4GB, even though the usable RAM might be limited to 2.3GB. This is especially a problem on high-end gaming machines that have huge graphics cards as well as lots of RAM."

"Microsoft even changed the way the OS reports the amount of RAM available; rumor is, due to pressure from OEMs," the developer told us. "In Vista and prior, it reported usable RAM, while in SP1 they changed it to report installed RAM ignoring the fact that much of the RAM was unusable due to overlap with video memory." And so many PC users are installing 4GB of RAM in their PCs and thinking that it is being used by the system, when in fact it is no more beneficial than if the RAM were simply poked halfway into the CD slot.

For example, Dell's top-of-the-line $5799 Alienware gamer PC comes standard with a 1GB video card, 2GB of RAM, and 32-bit Windows Home Premium. That means the system can only possibly use about 2.3GB of RAM, yet Dell "recommends" users spend $250 (or $8 per month with financing) to buy a 4GB upgrade that will offer them little more than bragging rights, as the 1GB video card and the roughly 750MB of other MMIO will make the extra 2GB unusable. Dell says "Upgrading your memory allows you to take full advantage of system capabilities as well as increasing system efficiency," but that's simply not true in this case.


If you do not already know this, on your system I would recommend setting the page file max ***and*** min settings to 4.5GB (1.5x physical RAM). This does not mean that VW will use all of it: once RAM plus page file exceeds the maximum process address space, you still have the same amount of memory (RAM and virtual) available to VW irrespective of RAM and pagefile size.

As you have not mentioned that you are running out of virtual memory, I assume that you are running out of address space available to the process. Increasing the pagefile will not give more memory to a single process, assuming that the pagefile is already set correctly.

So to do this, I go to My Computer > Properties > Advanced > Performance Settings > Advanced > Virtual Memory > Change.

Space available: 59165MB

What would I set the custom size to? Still 1.5x the RAM?

I am also running two separate hard drives. The second is a data copy of the drive in the server PC and syncs to it.

Will it help to set this drive as well?



Yes. You set the page file values for each disc/partition - click on each disc/partition and you will see the values change. For best results, you need to assign the page file to the one disc that will give the best response. If both your discs are similar in performance, your second drive may be best if it does not have much disc activity - generally try to avoid the OS partition as this gets background I/O activity. You may want to keep a small pagefile (say 200MB) on the system disc to allow crash dumps to be written to it - I think you may be warned about this if you try to set the system disc to no pagefile.

Then, on the partition where you are going to create the page file, set both the initial and max size to the same custom value - typically 1.5x the physical memory size. If you are running out of VM you can set it larger (say 2x), but too large a page file is detrimental to performance. Windows recommends a size that is 1.5x the physical RAM - in your case it's likely to be 3072MB. In my case, out of 4GB of physical RAM, 3580MB is usable, so my page file is 5370MB on my data disc and 200MB on my OS disc to hold crash dumps.

Finally, make sure that all other partitions are set with no page file.

If your disc does not have enough contiguous space to hold a 3xxxMB file, run a defrag too. As you are creating a page file of fixed size from the outset, the page file will not become further fragmented.

At some point in the process you will need to reboot. If you can, change the pagefile settings and defrag the disc, then do the reboot.
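If you want to check the 1.5x/2x numbers for your own machine rather than work them out by hand, a minimal C sketch along these lines (my own illustration, using the standard GlobalMemoryStatusEx call; not anything built into VW or Windows setup) reads the physical RAM that Windows can actually see and prints the rule-of-thumb page file sizes discussed above:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    MEMORYSTATUSEX ms;
    unsigned long phys_mb;

    ms.dwLength = sizeof(ms);          /* required before the call */
    if (!GlobalMemoryStatusEx(&ms)) {
        printf("GlobalMemoryStatusEx failed\n");
        return 1;
    }

    /* ullTotalPhys is the physical RAM visible to the OS, i.e. after
       any MMIO overlap has already been taken out. */
    phys_mb = (unsigned long)(ms.ullTotalPhys / (1024 * 1024));

    printf("Physical RAM visible to Windows : %lu MB\n", phys_mb);
    printf("Rule-of-thumb page file (1.5x)  : %lu MB\n", phys_mb * 3 / 2);
    printf("Larger page file (2x)           : %lu MB\n", phys_mb * 2);
    return 0;
}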


Thank you Ian

Your assistance is much appreciated.

The data drive is set at 3031. It might be that it is the drive being used; I will need to go and see what the graphics say.

The C: drive was set to no page file, with the custom values left blank.

So I can set the C: drive to 200MB for the dump file...

It seems you have more selections than I do, as I have no other partitions...

Or do you mean that after the custom settings have been put in I select No Page File??

After all this I will receive my MCSE Dip... :grin: :whistle:


Each setting is on a disc/partition by disc/partition basis. You only have two drives/partitions, so on the system drive set the custom default and max values to 200MB, and on the data drive make sure that the default and max values are 3031MB. For each drive, ensure that the radio button to use a custom value is checked.

When completed, it should say that your total VM/page file for all drives is 3231MB in size.

If you run out of virtual memory you can adjust the 3031 to be a little higher, say 2x RAM, but I don't think you are running out of virtual memory, otherwise you would have said so, as the error message clearly indicates when it's a VM problem rather than a standard out-of-memory problem. The 1.5x or 2x is just a rule of thumb. The correct value is slightly more than the maximum that is ever needed, but this is often not convenient to determine, so the rule of thumb is a good starting point.

VM is a balancing act and often system specific, but doing the above should give a decent performance for most types of apps.

At the end of the day, you can't fit a quart into a pint pot.

  • 10 months later...

Good day All

As you can see, I did upgrade my RAM - well, I got more of it in the form of two 2GB RAM sticks. As predicted and as I was told on here, only 2.99GB is shown by the system.

Ok so it works....well in some ways.

When rendering in VW, I could have multiple applications open and the CPU was running at 50%, compared to only VW being open and the CPU at 100%. So that aspect is wonderful.

There seems to be a problem with the system now, maybe a conflict - perhaps there is too much RAM and the system tries to fix the problem.

In Task Manager there is a list of applications listed under Processes. Down at the bottom you find System/SYSTEM with 236K mem usage. Normally this runs at 0 and goes up to maybe 5 when opening an app. It now at times runs at 100% CPU and you are unable to do anything. It is so bad I can't shut down via Windows, but have to hold in the power button on the PC.

Would anybody know why?

A friend mentioned that it is my RAM overshadowing my CPU.

Any comments are welcome as this is driving me crazy.

