RyanMartinDesign — Posted February 4, 2022
Hi there, I'm running an M1 Max (32 GB, 10-core, Monterey), and I've noticed that when working with complex hidden-line viewports, Vectorworks consumes and compresses large amounts of memory (upwards of 50 GB of virtual memory). This is not unique to the Mac environment; I can replicate the same issue on my Windows machine (Win11, 64 GB RAM, Ryzen 5 3600, RX 5700 XT). The only fix is to save and quit Vectorworks, which lets the OS release the memory; I can then reopen with no issues (once the viewports have been updated again, of course!).
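If you want to confirm the growth yourself before restarting, a rough way to watch the app's footprint from the terminal is to list its resident and virtual memory (this is a generic sketch, not a Vectorworks feature; the process name pattern "vectorworks" is an assumption — adjust it to match your install):

```shell
# List resident (RSS) and virtual (VSZ) memory, in KB, for processes
# whose command name matches "vectorworks" (case-insensitive).
# The name pattern is an assumption -- adjust it for your install.
ps -axo rss,vsz,comm | grep -i vectorworks \
  || echo "no matching Vectorworks process found"
```

Watching VSZ climb across a session of viewport updates (and drop after a quit/relaunch) would confirm it's the application holding virtual memory rather than general OS caching.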
station — Posted February 25, 2022
I'm having the same issue, running an i7 (8-core) with 32 GB, but with basic modelling. I'm upgrading, but it's not encouraging that a processor with a bigger cache and twice the memory is still having problems.
RyanMartinDesign (Author) — Posted February 26, 2022
Replying to station: I feel like I've been able to replicate this problem for a few years now. Maybe it's a limitation of Vectorworks' ability to handle larger-scale projects?
Jeff Prince — Posted February 26, 2022
Replying to RyanMartinDesign: You are not alone. I've been experiencing this for several releases on machines ranging from 16 to 128 GB of RAM. I guess Vectorworks gets tired and needs a coffee break periodically 🙂