P Retondo

Cloud services performance

My general experience with cloud-based computing is that it is not suitable for CAD: file sizes are so large that the bandwidth required, especially for uploads, would make it unwieldy. My typical project can be upwards of 750MB, and with incremental saves every 100 operations and frequent updates of linked files in a multi-file project, that's a nightmare over the web. It's bad enough having to wait over 90 seconds for a linked file to update in my sheets file just from my own desktop SSD. That alone makes it a non-starter for me, even before getting into problems with fonts and other customizations. So the only use I could see is temporary archiving for use in another location, or simply transferring files, which I can do with Dropbox already. If NNA is thinking of moving to an entirely cloud-based operation, like Google is trying to do with its apps, count me out! It'll never work.

Anyone else have a contrary experience or opinion?

It depends on the bandwidth of your network connection, as well as how much data is being sent for things like autosave operations. In the first iteration of Project Sharing, for instance, we transferred the entire file every single time, which for a 20MB file is nothing. But as you mentioned above, when it's nearly a gigabyte and you aren't on a fast pipe to the internet, it becomes a slogging nightmare. However, there are still huge chunks of the world that don't have the option of true high-speed internet, so just saying "Well, they should get a better connection" isn't a useful answer in our opinion, especially since we cater to so many different user bases across the entire planet.

The main way to fix this on our end (which is what we are continually doing as we revise Project Sharing and other cloud-related transfer systems; it probably won't ever show up in feature videos, as it is being done bit by bit) is to shift to transferring ONLY the segments of data that were actually modified, so the save time and bandwidth needed would be directly related to how much you actually changed, not to how large your whole file is.
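The "transfer only what changed" approach can be sketched as a simple chunk comparison: hash fixed-size chunks of the file and send only the chunks whose hashes differ from the previous save. This is a minimal illustration in Python, not Vectorworks' actual implementation; the chunk size and function names are assumptions.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # assumed 4MB chunks

def chunk_hashes(data: bytes) -> list:
    """Hash each fixed-size chunk of the file contents."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

def changed_chunks(old_hashes: list, new_data: bytes) -> list:
    """Return indices of chunks that differ from the previous save.

    Only these chunks would need to be uploaded, so cost tracks the
    size of the edit, not the size of the whole file.
    """
    new_hashes = chunk_hashes(new_data)
    return [
        i for i, h in enumerate(new_hashes)
        if i >= len(old_hashes) or old_hashes[i] != h
    ]
```

With this scheme, adding a rectangle to a 750MB file dirties only the chunks containing that object's data, so the upload is a few megabytes rather than the whole file.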

Most of the cloud tech we have been looking into is as an option, I do not see us going cloud-only like Google or Adobe have done because of the issues you've stated above as well as other technical reasons. We may very well eventually offer licenses of Vectorworks that run on cloud hardware, but I see this as a useful alternative only in certain circumstances, not something for everyone.


Jim, thanks, it's gratifying to hear that NNA are taking a pragmatic approach. Needless to say, "do your work from anywhere" doesn't work unless there is huge bandwidth everywhere. Speaking of incremental saves, is there any way that algorithm could be applied to updating a linked file? In v2015, at least, it takes up to ten times as long for a large "source" file to update compared to actually saving the same file to my hard drive.

Seems like if incremental saves and updates are a goal, the first place to apply the idea would be to updating a "target" file when a "source file" is edited. My 534MB design file (source) takes 10-15 seconds to save after changes. Updating the same source file in my target file takes 1 minute 28 seconds, to incorporate the exact same changes. That's approximately a 10:1 penalty for breaking a project into multiple files, which can be pretty time-consuming at the redmarks phase. I spend more time waiting for updates than making actual corrections to the sheets. (not that 10-15 seconds per save is in itself acceptable - should be a few seconds at most to save after adding a rectangle, don't you think?)

PS: 3.47 GHz Xeon processor, 64 bit OS, solid state hard drive


Some of that is more efficient decisions on when to transfer data and how much to transfer (Transfer being the same as Save in this case), but the other part actually ties into that ever-present multithreading discussion. Currently file save operations are all on the same single computational thread as everything else that isn't related to rendering. In the future, file saving will be able to take place in one or more threads, and the user will not be locked out while this is taking place as there will be other threads free to accept user input, so it will sort of be solved from both ends.

I am not sure how soon that will get attention, however. The next big chunk of the multithreading upgrades you folks will see will likely be in screen redraw and Top/Plan navigation, with things like file storage and data transfer coming after as needed. But with a lot of engineering focus being placed on accommodating multiple users working on large projects, it might come earlier than I expect.
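The background-save idea above amounts to: snapshot the document state on the main thread, then hand the write off to a worker thread so the user is never locked out. Here is a minimal Python sketch of that pattern; the `Document` class and its methods are hypothetical illustrations, not the Vectorworks API.

```python
import copy
import threading

class Document:
    """Sketch of a save that runs on a worker thread."""

    def __init__(self):
        self.objects = []
        self._save_thread = None

    def save_async(self, path):
        # Snapshot on the calling (main) thread, so edits made while
        # the worker is writing cannot corrupt the file being saved.
        snapshot = copy.deepcopy(self.objects)
        self._save_thread = threading.Thread(
            target=self._write, args=(path, snapshot))
        self._save_thread.start()  # main thread stays free for input

    def _write(self, path, snapshot):
        with open(path, "w") as f:
            for obj in snapshot:
                f.write(repr(obj) + "\n")

    def wait(self):
        if self._save_thread:
            self._save_thread.join()
```

The key design choice is taking the snapshot before the thread starts: the worker writes a frozen copy, so the user can keep editing immediately and those edits simply land in the next save.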


Thanks for the response. I'm not sure these initiatives would solve the problem of updating source files: if the workflow at hand is change source -> save source -> update target -> output PDF, one still has to wait for the update to complete before being able to correct the sheet. On the other hand, if, as you implied in your post about optimizing data transfers to and from the cloud, the transfer were isolated to the objects actually changed, as opposed to re-writing the entire file, then we would be talking about something!

BTW, this problem is currently an impediment to structuring our files for a project. As you know, file bloat can occur when rendered viewports are maintained, and we want to retain them because of render times. The update issue makes it inefficient to break the project into multiple files, so the tendency is to minimize that, leading to said file size bloat.


Pete, saving a file mostly involves writing data to disk. The size of the file to be written, the hard drive speed, and the network speed (if saving to a network volume) determine the speed of the operation. Incremental saving is a solution that could definitely improve saving, since the amount of data to be written to disk could be significantly reduced. However, the Vectorworks file format doesn't support it yet.

Updating a source (referenced) file is a completely different process. It does not involve writing the file to disk at all. When updating a source file, the data in the target file that references data from the source file must be updated. Updating that data can be fast or slow depending on the changes made in the source file. For example, if a wall style was modified in the source file, all walls in the target file that use this style have to be updated. The number of walls and their complexity determine the speed of the updating process.

Just moving the data from the source to the target file is fast. It is reconciling that data within the target file and regenerating objects that is the expensive step.
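The two phases described above can be illustrated schematically: copying the referenced style data across is cheap, while regenerating every object that uses a changed style scales with the number and complexity of those objects. This is a toy Python sketch of the idea, with all data structures and names hypothetical.

```python
def update_reference(target: dict, source: dict) -> int:
    """Sketch of a reference update; returns the number of objects regenerated."""
    regenerated = 0
    # Phase 1: fast -- copy the updated style definitions across.
    target["styles"].update(source["styles"])
    # Phase 2: expensive -- every object using a changed style must be
    # rebuilt. Cost scales with the number and complexity of objects in
    # the target file, not with the size of the change in the source.
    for wall in target["walls"]:
        style = target["styles"][wall["style"]]
        wall["thickness"] = style["thickness"]  # stand-in for full regeneration
        regenerated += 1
    return regenerated
```

This is why a one-line style edit in the source can still trigger a long update in the target: phase 2 runs over every dependent object regardless of how small the edit was.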

If you are experiencing unexpected slowdowns when updating references, please send us a copy of the files and we will be happy to take a look at it (Tech@vectorworks.net). Your files could help us detect a particular configuration or interaction that is not well enough optimized.


© 2018 Vectorworks, Inc. All Rights Reserved. Vectorworks, Inc. is part of the Nemetschek Group.