Everything posted by PVA - Admin

  1. @M5d The benchmarking referred to in this thread is not the benchmarking of hardware, nor the effect of hardware on the software, but the benchmarking of versions compared directly against each other: as suggested further above, identical files on identical hardware, to give an idea of whether the software itself is slowing down as releases progress. I plan to do this testing and post it. I hypothesize that for things like geometry calculation and duplication, we will see speeds increase with 2015 and then remain very much the same until today on the same hardware under the same test conditions; we will see if that pans out. This information is as valuable to me in working to improve the software as it is to you all in deciding whether to purchase in the first place, upgrade regularly, or choose another package entirely. I don't want anyone using Vectorworks who doesn't feel it is worth its cost, and I am not personally willing to deceive people in order to drive sales. Whether I like a tone or not is immaterial. It is imperative that this forum be kept clean of false accusations and misinformation. If there is information I possess that I do not release, it is because that information is still covered under NDA. The things you are saying are incredibly important and need to be said, and thank you for saying them; it is only the tone and the accusation of intentional deception that I have to take issue with. Let's just drop this portion of the debate entirely and keep working to make things better.
  2. Contact tech@vectorworks.net with the last 6 of your 2011 serial number, they may be able to get you a copy.
  3. That chart isn't, no; it's strictly the total volume of crashes. When we do the mega charts internally, this is shown followed by comparisons like crashes-per-day-per-user (a rough sketch of that normalization appears at the end of this list). We also have to account for other things like time of year (December and January are generally quiet in terms of how many people are using the software) as well as things like OS compatibility. For instance, there was a huge load of crashes related specifically to Mojave, but we can also separate the Windows and Mac data to see if it really is a spike just because of the OS, or if the crash rate is rising or falling independent of the OS. I actually love the heck out of our analytics tracking, and engineering has been repeatedly doubling down on it. For instance, I got a real morale boost when we started discussing how "Quality is not a Switch": not something that will simply exist or not exist within a version, but something that must be tracked and catered to as much as any other aspect. We always cared before, of course, but now we care AND have tools to back up our decisions with data. It doesn't go by machine year that I am aware of, but we can indeed peel off and look at JUST iMac Pro configurations, or MacBook Pro, etc., and then effectively sort out specific ones by CPU or GPU options. It's less easy on the Windows side since there are so many configurations, but on that side we can also split it up by GPU or CPU to see if there's a hardware-specific trend. We have a lot of alerts set up now to warn us ASAP if something like "Anyone with a GTX 9800 is doomed" happens after an update, for instance. I can share and discuss these metrics to some extent, but there are things I am not permitted to reveal because of our user privacy rules, which I am more than happy to abide by. Too many companies let that stuff slip these days. I will gladly answer all questions related to it that I can within that ruleset.
  4. We can't just send the fixed ones back one at a time; they have to be pushed in a build. Eventually, we want to be able to correct them piecemeal and then send the updates out and have them patch in the background so you never even know you had a bug, but our tech and process aren't ready for that yet.
  5. It is incredibly unlikely that 2018 will ever receive another patch.
  6. Unfortunately, that's the exact opposite of a benchmark. It is the subjective experience one has when using software, as most of our users' workflows are not standardized or comparable to most others'. It is the MOST important metric in my opinion as well, but it is so personalized that it has to be judged in a much more human way than something like objectively faster rendering speeds and load times. This is kind of the problem I'm getting at: if we start dictating what a typical residential or commercial project is, sure, that's possible. The fact that we DON'T dictate what is typical and what is out-of-spec or beyond design intent is the key issue. I can pop out files that *I* consider standard, sure, but a lot of the time when getting feedback it becomes "Well, obviously you would use Floors and never Slabs" or "No one uses tool X, everyone uses Y," followed shortly by "That's preposterous, Y is used by everyone I know and their dog, X is unheard of and used only in Liechtenstein." This is of course an overdramatic comparison, but I am still finishing the first cup of coffee. The key thing I want to convey is: I WANT to provide metrics and benchmarks and version comparisons; it is merely a matter of deciding the proper way to do so, one that can be conveyed meaningfully. They are two logically separate topics, one asking for official benchmarks on speed (objective) and one concerned with a perceived decline in speed across versions (subjective); there's no way I could possibly carry on both conversations in a single thread without it getting out of control. I posted a question asking specifically what kinds of things users wanted to see in benchmarks and said that I see the value of them; how did you arrive at the conclusion that I/we wanted the opposite of that? Though to be clear: regardless of how frustrated you may be, the passive-aggressive tone that has so permeated our media and politics will not be accepted here, and I need you to dial it back a bit. In any case, us ignoring a problem like that isn't possible; you all could easily just post benchmarks refuting any false claims we made, or lack of claims we made. I'm not trying to NOT show benchmarks, I'm trying to ONLY show benchmarks that are: 1) factual (and not just in ideal conditions), 2) beneficial to a majority of users, 3) not lying by omission (looking at those ever-changing focus metrics Apple uses at their keynotes), and 4) useful in purchasing decisions. It is no secret at ALL that geometry calculation speeds have not changed between versions. This is because of that single-threaded core geometry engine I've discussed here so often. That issue pretty much just translates into a flat-line chart, however, not one that is increasing or decreasing. You're pretty much going to get the same time results across versions for things like duplicating an array of cubes or importing DWGs with a set number of polygons (a rough sketch of that kind of timing harness appears at the end of this list). If those kinds of charts are what people really want, sure, I'll post them, but I don't think they help OR hurt. The problem with geometry calc slowness is known, and talking more about it or filing more requests related to it will not change it any faster. It is already being worked on. If it were possible to make that go faster, I'd be doing whatever made that happen instead of writing this post. I normally don't share things like this because of their complex nature, but here's an idea of what's going on from an analytical side.
Speed is VERY hard to pull anonymous metrics on, but Vectorworks crashes are quite trackable. As a color key for the above chart: from left to right, it shows the weekly number of crashes we get from the various versions of Vectorworks. I began this filtered chart at the launch of 2018 SP0. The big dark and light blue bits in the center are Vectorworks 2018 SP2 and Vectorworks 2018 SP3. Versions are stacked from top to bottom on these charts, the top being the oldest version included and the bottom being the latest. By far, the most unstable versions of Vectorworks were from the middle of 2018's life cycle. The small green and red bits on the bottom right are 2019 SP1 and SP2. Crashing has reduced SIGNIFICANTLY in that time, yet we still have reports claiming that 2019 is more unstable than past versions. This is because these simple metrics aren't enough to capture everything there is to the experience of working with a set of tools like ours. It's all too easy to discount claims of slowness or instability with metrics like the one above, but we choose not to hide behind metrics like this, because we know they can't possibly tell the whole story, and because we use them ourselves and see that what you all are saying FEELS true. We will be sharing more metrics in the future. We will be making more benchmark-style reports available in the future. If you all can provide me with specific objective comparisons you would like to see, I will provide them. That's the point of this thread.
  7. I am unable to replicate this here. Get in touch directly with tech@vectorworks.net; they will likely request this from your machine:
  8. You may have accidentally moved to the 3D view. Go to View > Standard Views and select Top/Plan. It might currently be on Top.
  9. I need the system profile from that machine, please; the above link will get you the steps needed, and you should be able to drag/drop it here as an attachment.
  10. Once you get the chain and snaps made though, this tool may become useful to you: http://app-help.vectorworks.net/2019/eng/index.htm#t=VW2019_Guide%2FAnnotation%2FCreating_Repetitive_Unit_Details.htm
  11. I mean draw the 3D loci afterwards, separately from the extrusion. I recommend selecting one link and one 3D locus, then using Right Click > Align/Distribute > Center (once for each axis, if needed). That should give you a perfectly centered snapping point. Then duplicate that snap point and move it in 3D via Modify > Move by the desired amount, relative to the overall dimensions of the object and where you want the snap points to be, and repeat that operation, duplicating the center snap locus each time you need to offset a new one. You should only need to do this to one link; then just select that link and its needed loci, duplicate/rotate them into place, and save the two links and their identical snap points as a repeated symbol.
  12. This very question keeps many of us up at night. It's a big one. We also cater to many different industries that use varying levels of information inclusion in their models (for instance, the Entertainment folks have effectively been doing their version of BIM for quite a while, and certain aspects of Site Design have been data-driven since I started here), and the right answer for one is not always the right answer for all of them. I am very glad, for once, that the method in which we handle this is above my pay grade; it's an industry-wide, big-ticket question in a lot of ways.
  13. I think the quickest way would be to add 3D Loci objects at the points you want to be snappable, then, with the link geometry and the 3D loci selected, create a symbol, most likely with the symbol's insertion point aligned to one of the snappable points. That should let you insert it at one of the snap points if desired, in addition to keeping file size down and letting you modify all the chain links at once later if need be.
  14. Did you register with a school in one country but choose your location as being in another? That may have gotten it stuck. If you email your local support, however, they should be able to correct it; I do not have admin control over that region.
  15. This came up recently in another thread, but I think it merits its own discussion: I want to do this. Some of our distributors have directly asked us for something similar. The key difficulty I'm running into is: I have yet to find a way to even come close to showing what a "standard" file is. It can't be defined by number of objects, since objects have a broad range of complexity and various types of objects affect performance in various ways. It can't be defined by file size, since a dramatic difference in file size can occur even in two files that have the same geometry, depending on how cleanly the resources in each file were managed. For instance, a common issue we run into is a report of something like "Sheet Layers Are Slow," which can have a bunch of different causes. The core one I see most often is a user using only a single sheet layer and then putting dozens or sometimes even hundreds of instances of a Title Block object on it, which will bring things to a crawl. Title Blocks are optimized so that they are only recalculated and loaded when a sheet is viewed, or right before it's printed/exported. If ALL title blocks are on a single sheet layer, this optimization becomes worthless, and you have to wait for all sheets to update in order to continue working once you switch to that single sheet layer (a conceptual sketch of this lazy recalculation appears at the end of this list). This means that in order to define performance, I would also have to effectively dictate workflow, something we have not done before in most cases. The general rule in Vectorworks has long been "You can do almost everything 5 different ways; the correct one is the one that works best for you," and I really do love that. But making standardized performance indicators for Vectorworks runs exactly in opposition to this mindset in every way I've been able to come up with. SOME parts of Vectorworks are easy to benchmark, such as rendering speed, which is why you have all likely seen those that I have posted. Those are much more cut and dried, as I can test the exact same scene across different hardware and the resulting rendering time is a direct reflection of performance. Things like duplicating arrays of objects, doing complex geometrical calculations, etc., do not result in times that vary directly with hardware performance, since a lot of the slowness in those operations is currently a Vectorworks software limitation and not the fault of your hardware. Until these processes are moved to multiple threads, I don't think they will be benchmark-able in a meaningful way. (To clarify, I TRIED to benchmark them in a meaningful way, and got more variance in the completion time from what other applications were open than from what hardware I used.) I would very much like to hear any suggestions or feedback if anyone can see an avenue to approach this that I have missed. I miss plenty. I will not be working on this for some time, but I wanted to go ahead and pop this discussion up and take responses while it was fresh in my mind. I would also like to hear the KINDS of performance indicators you all are interested in so that I can ponder how best to provide them in a technically simple but accurate manner.
  16. Do you mean 3D models of pinball/arcade cabinets that you would use IN a rendering? I would suggest taking a look at 3Dwarehouse: https://3dwarehouse.sketchup.com/search/?q=arcade cabinet&searchTab=model
  17. What version and service pack of Vectorworks? Reply back with the following from your machine as well and I can take a look:
  18. It had to be removed from the Styles; otherwise, if you wanted one slab transparent and another drawn normally, you'd have to create a whole new style just for that purpose.
  19. Please reply back with the system profile I requested as well as the requested screenshot. I did not; what made you think this? If there's a UI element on the forum that confused you, please let me know and I can try to clean it up.
  20. Did this slowness only happen after installing it on the newer machine, or was it happening before, and the newer machine was an attempt to speed things up? Were the files fine in 2017 and only slow in 2018/2019? Could you please attach a screenshot of an entire normal sheet layer for one of your files? Reply back with the following as well, please:
  21. Hrmm, that isn't screen/layer redraw then. Make sure to have tech@vectorworks.net take a look in that case; there might be some stuck geometry slowing you down unduly. 1800 isn't THAT many objects and shouldn't start crawling unless they each have hundreds or thousands of vertices or something like that.
  22. We can narrow down where the issue is with a symptom like that: Do you get the same slowdowns with Tools > Options > Vectorworks Preferences > Display > Navigation Graphics set to both the top and bottom settings in a quick test? Or does one seem much slower than the other?
  23. Was this issue already submitted to tech@vectorworks.net directly, or to your distributor? The forum is mainly for user interaction; it is not a formal support channel.
  24. I think of it as my Industrial Cutting Laser.
  25. NOTE: None of this is intended as any kind of excuse or reason for delay; I simply wanted to give insight into what's going on and how things look internally. Tech now uses TeamViewer daily for exactly this reason, and it works excellently. This is an exceedingly good list of practices; my goal is to make many of them a distant memory, however 😉 The message is there, it's just going to take time for the change to get all the way through the pipe. I mean, you can certainly do this if you feel it's merited, and I don't want anyone paying us for something that doesn't do what they need, but from what I can see that would not make it happen faster. We agree; this is something we are doing more and more often, however it wouldn't normally be the techs who would go. Tech support normally handles issues like installation, driver compatibility and configuration, and "basic" usage questions, but they don't have the resources to expand into even moderately complex workflows (project sharing outside of standard advised practices, interoperability with other packages, working with references that rely on other origin systems, etc.), so a lot of that falls on what we have here called Product Specialists. However, these two teams are in completely different departments because they developed independently. We identified this as a problem a ways back and have been working since then to get the two disparate groups more meshed together to develop complex workflows (I use the term "complex" to indicate merely more than explaining wall component heights, for instance; it's relative) and to create more standardized training content and whitepapers on how we at least EXPECT users to work with the various combinations of tools. I cannot open my mouth about the specific things I want to open my mouth about at the moment, as I'm in muzzle mode for a pretty damn cool project coming up, but I wanted to deliver as much of an answer as possible at the present time.
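
Regarding the crash metrics discussed in item 3: the post mentions turning raw weekly crash counts into crashes-per-day-per-user and splitting them by OS. Below is a minimal sketch of that kind of normalization under those assumptions; it is not Vectorworks' actual analytics code, and the record fields and numbers are hypothetical.

```python
from collections import defaultdict

# Hypothetical weekly aggregates; field names and numbers are made up for illustration.
crash_reports = [
    {"week": "2019-W02", "os": "mac",     "crashes": 410, "active_users": 9200,  "days": 7},
    {"week": "2019-W02", "os": "windows", "crashes": 530, "active_users": 14100, "days": 7},
    {"week": "2019-W03", "os": "mac",     "crashes": 380, "active_users": 9900,  "days": 7},
]

rates = defaultdict(list)
for report in crash_reports:
    # Normalize raw counts so a quiet period (fewer active users) is not
    # mistaken for a stability improvement.
    per_user_day = report["crashes"] / (report["active_users"] * report["days"])
    rates[report["os"]].append((report["week"], per_user_day))

for os_name, weekly in rates.items():
    for week, rate in weekly:
        print(f"{os_name} {week}: {rate:.6f} crashes per user-day")
```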
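
Regarding the version-to-version timing comparisons in item 6 (for example, duplicating an array of cubes in identical files on identical hardware): below is a minimal sketch of a repeatable timing harness. The duplicate_cubes() workload is a hypothetical stand-in, not a Vectorworks API call; a real comparison would drive the application itself.

```python
import statistics
import time

def duplicate_cubes(count: int) -> list[tuple[float, float, float]]:
    # Hypothetical stand-in workload for "duplicate an array of cubes".
    return [(float(i), float(i), float(i)) for i in range(count)]

def benchmark(fn, *args, runs: int = 5) -> float:
    # Run the same operation several times and report the median,
    # which resists one-off noise from other open applications.
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(*args)
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

print(f"median duplicate_cubes(100000): {benchmark(duplicate_cubes, 100_000):.4f} s")
```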
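
Regarding the Title Block behavior described in item 15 (title blocks are only recalculated when their sheet is viewed or right before printing/export): the sketch below is a conceptual illustration of why stacking every title block on one sheet layer defeats that optimization. The class and method names are hypothetical, not Vectorworks internals, and the timings are a stand-in cost.

```python
import time

class TitleBlock:
    def recalculate(self) -> None:
        time.sleep(0.01)  # stand-in for the real recalculation cost

class SheetLayer:
    def __init__(self, title_blocks: list[TitleBlock]):
        self.title_blocks = title_blocks
        self.dirty = True  # nothing is recalculated until the sheet is viewed

    def view(self) -> float:
        # Lazy recalculation: pay the cost only when this sheet is actually viewed.
        start = time.perf_counter()
        if self.dirty:
            for tb in self.title_blocks:
                tb.recalculate()
            self.dirty = False
        return time.perf_counter() - start

# 50 title blocks spread over 50 sheets: viewing one sheet pays for one recalculation.
spread = [SheetLayer([TitleBlock()]) for _ in range(50)]
print(f"one sheet, one block:    {spread[0].view():.3f} s")

# The same 50 title blocks stacked on a single sheet: viewing it pays for all of them at once.
stacked = SheetLayer([TitleBlock() for _ in range(50)])
print(f"one sheet, fifty blocks: {stacked.view():.3f} s")
```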