
Best solution for a project sharing problem?


arquitextonica

Question

Hi All.
We are working across two different locations on a project sharing file.
The thing is that our IT has set up a server at each location, and the servers are "synchronized" with each other. That means there are times when the VWXP files can differ, because colleagues at the two offices are working on different physical instances of the files. (For stability, both servers use the same drive letter and so on.)
This has already produced write-permission conflicts: an admin was told he did NOT have write permission on the file when trying to commit his changes.
A possible solution could be to work on a locally synced OneDrive copy of the VWXP file. Synchronisation would then be handled by OneDrive rather than by our servers, and though I am sure it will still lead to conflicts, it should be more stable, right?
What would be your solution otherwise?
Thanks in advance!!!
 


10 answers to this question


Another thought: is there a reasonable-speed network connection between the two servers? If so, could you put a shortcut or alias on one server pointing to the true master VWXP file stored on the other? The people at the location with the alias will take longer (maybe much longer) to check out and check in data, but it would avoid having to use an external file-sharing service.

20 minutes ago, Pat Stanford said:

Another thought: is there a reasonable-speed network connection between the two servers? If so, could you put a shortcut or alias on one server pointing to the true master VWXP file stored on the other? The people at the location with the alias will take longer (maybe much longer) to check out and check in data, but it would avoid having to use an external file-sharing service.

I would go for file sync via OneDrive. Project Server is not really designed to work over the internet (yet). 


Thanks for the insight. The thing is, I believe IT is well intentioned but not used to our way of working: files of several hundred MB that have to be constantly updated. That works superbly at each location, because the local servers are top class and very well managed, but if one location has to access the other's server... they are hundreds of kilometres apart... one of them would suffer.

2 hours ago, arquitextonica said:

Thanks for the insight. The thing is, I believe IT is well intentioned but not used to our way of working: files of several hundred MB that have to be constantly updated. That works superbly at each location, because the local servers are top class and very well managed, but if one location has to access the other's server... they are hundreds of kilometres apart... one of them would suffer.

We have the same issue: a large project file, ~400 MB (mainly due to aerial imagery), and we're 1500 km apart. We work on the file one at a time, i.e. we don't do the workgroup-sharing thing, as we've both seen it fail disastrously. The file resides on my server (a Synology system); the other party works on it and saves it back. But it's far from ideal. We had hoped we could keep a copy of the aerials locally and have VW pick them up and open them from the local copy, but that has proved impossible.


I think we work in a similar way: we have a server, people open their individual project files and save them back to the same server. If they are working overseas or similar, they do the same via TeamViewer with no significant lag. However, the problems happen when the files grow beyond about 250 MB, which is usually because of referenced material (architectural models, multi-storey buildings, or just aerial images), people not checking out items, people forgetting to save and commit for over a day, the server backing up at the same time, more than one staff member saving and committing at once, or someone saving to their desktop instead of the server, or creating two separate files.

 

So yes, a lot can go wrong, but our main problems are with the larger file sizes, with our small team of five or so working on the same project.

 

Without a doubt, though, there have been times when one person has saved and committed, another has refreshed 20 minutes later, and the data just isn't transferred across: no error message or any indication that something went wrong. Sometimes it seems to magically appear hours later. Obviously there is a fair amount of user behaviour involved, but with those larger files it does seem a constant battle, even when everyone is working in the office.

 

On 8/4/2022 at 8:00 PM, arquitextonica said:

Thanks for the insight. The thing is, I believe IT is well intentioned but not used to our way of working: files of several hundred MB that have to be constantly updated. That works superbly at each location, because the local servers are top class and very well managed, but if one location has to access the other's server... they are hundreds of kilometres apart... one of them would suffer.

You're right, that one will suffer. And you will both suffer, but more moderately, using file syncing like OneDrive. It needs very good communication and a little patience. It's not as instant as a local Project Server.

 

Despite what I said about Project Server not being designed to work over the internet, it is possible. One way is for one team to use a VPN to access the Project Server in the other location, but you will hit performance issues with this option.

 

The other option is to open up a port and allow the other team direct access via the server's external IP address. Performance will depend on how good the internet connection is where the server is located, particularly how fast the upload speed is and how low the latency is. The other problem with this option is that, because Project Server isn't really designed to work over the internet, there is no password protection, for instance. All security relies on the obscurity of your IP address and on hackers not having an exploit to use (people would need Vectorworks to access the files).
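If you do go down this route, restricting the forwarded port to the other office's known address at the firewall limits the exposure somewhat. A minimal sketch using ufw on a Linux host; the port number and the remote office's IP are placeholders, not real Vectorworks values (check your Project Sharing Server's configuration for the actual port):

```shell
# Sketch only: 198.51.100.23 stands in for the remote office's static IP,
# 12345 for whatever port your Project Sharing Server listens on.
sudo ufw allow from 198.51.100.23 to any port 12345 proto tcp
sudo ufw deny 12345/tcp    # everyone else is refused
sudo ufw status numbered   # review the rules before relying on them
```

ufw evaluates rules in order, so the allow rule for the office IP matches before the general deny.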

 

The final option is to set up Project Server on a third party cloud server, hosted at a top tier host such as DigitalOcean. The problem with this option is the added expense (and the above issue of security). 

 

When Project Server was being developed I got it up and running on a VPS using the Docker image. I never really got around to comparing performance, though, because my free trial ran out (along with my time to mess about) and I didn't want to start paying anything at the time. I didn't want the hassle of managing the server either.

 

What we really need is for Vectorworks itself to add Project Server to their Cloud Services package. That's when we'll start using it.

 

If you're happy to use cloud hosting and your IT team is up for the challenge, what follows are the instructions I posted to the beta forum at the time to get Project Server up and running on DigitalOcean. It might be easier than this now, I don't know. I taught myself this through trial and error, so it comes without any guarantee 🙂

 

1. Open an account on DigitalOcean. This referral link will give you $100 credit valid for 60 days: https://m.do.co/c/a055032998bb

 

2. Buy a domain name (or get a free one from Dreamhost or Tokelau or Names.co.uk) and point the domain name servers to DigitalOcean (DO). With Tokelau you may need to transfer the domain hosting to, say, Google Domains first, and then point the name servers at DO. I transferred mine to Gandi.net because they offer a very good, cost-effective email service too.

 

3. Use this one-click installer to create a Docker Droplet on DigitalOcean. (If you want to use a subdomain, create an A record in DigitalOcean > Networking for the subdomain you want to use and point it to your Droplet.)

 

4. Open a terminal, ssh into the server, type yes, and follow the instructions to change the password.

ssh root@use_your_droplet_ip

5. Create SSL keys for a self-signed cert (there are also instructions on how to install with a signed cert, although I haven't tried them)

mkdir local-certs && cd local-certs
# Generate a private key: choose ONE of the two commands below
# (the second, EC key, overwrites the first, RSA, if you run both)
openssl genrsa -out portainer.key 2048
openssl ecparam -genkey -name secp384r1 -out portainer.key
# Then create the self-signed certificate (valid ~10 years)
openssl req -new -x509 -sha256 -key portainer.key -out portainer.crt -days 3650
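Once the key and certificate exist, a quick sanity check (standard openssl, nothing Portainer-specific) confirms the subject and validity dates before you move on:

```shell
# Print the self-signed certificate's subject and validity window
openssl x509 -in portainer.crt -noout -subject -dates
```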

 

6. Create a volume and install Portainer: https://blog.ssdnodes.com/blog/portainer-docker-management

docker volume create portainer_data
# Note: the run command below bind-mounts ~/portainer for data rather than
# the portainer_data volume created above; use -v portainer_data:/data if
# you prefer the named volume
docker run -d -p 443:9000 -v ~/local-certs:/certs -v ~/portainer:/data -v /var/run/docker.sock:/var/run/docker.sock portainer/portainer --ssl --sslcert /certs/portainer.crt --sslkey /certs/portainer.key

 

7. Access your Portainer at https://droplet_ip or https://domainname (the docker run command above maps host port 443 to Portainer's internal 9000) and set it up via the web GUI. Create an admin user and password, then tell it you want to manage the local instance.

 

8. Get the project file server Docker image (from the VW installer) and follow Scot's attached PSSPortainer instructions to set up your project server image.

(Bear in mind he has a slightly different interface. For instance, I had no way of selecting the image; I had to click Advanced and just type in the name of the image: project-sharing-server.)

 

9. Now try saving a project file from Vectorworks using https://domainname:9000 as your project file server.

PSSPortainer.docx

Edited by Christiaan
