jmanganelli

Member
  • Posts

    85
  • Joined

  • Last visited

Everything posted by jmanganelli

  1. @Taproot to your point, I initially wanted larger monitors, but the price for a high-quality 4K monitor with a great color range goes up a lot after about 32” - at least that was the case as of December 2019. If I were not being budget conscious, or if money were no concern, I would have gone for at least one 44”+ or one 52”+ monitor so that I could proof a full 24x36 or, ideally, 30x42 sheet at 1:1 scale without having to print it out. But those monitors were still quite expensive at the time. I rationalized settling for the 32” monitor because it is big enough to proof a half-size 30x42 sheet at 1:1, which is very useful.
  2. @line-weight https://www.vectorworks.net/design-summit/virtual-keynote I believe it is in Dr. Sarkar’s keynote.
  3. @line-weight I believe that the new context menus that put tools at the cursor tip, which they previewed a few months ago, will address this. In Rhino, it is easy to customize the middle-mouse-button context menu and to access ALL tools from it, so it is not necessary to go to the tools lining the perimeter of the screen. There was a discussion about this topic in this forum 1-2 years ago, mostly from the perspective of making VW a better speed modeler (see link below). It appears that the VW team listened and addressed this. The new workflow they previewed looks very promising. I look forward to trying it out.
  4. @jcaia the curve is slight. For most work I don’t notice it. There have been a few final renders that required extra scrutiny. But I generally correct to 2 point perspective, or render that way, so it turned out not to be a major issue. The bigger issue is the color fidelity, in my experience. For that reason, I picked a monitor with good color range and vibrancy.
  5. IESVE is no more expensive than the other major energy analysis tools, and no more than getting into Lumion or a perpetual license of C4D. Sefaira and cove.tool are strong options for early design. I think that for mechanical systems of moderate complexity, cove.tool can be used end to end. Now that SketchUp is subscription-only, there is a subscription package that includes Sefaira. It's an extra $1,000/year over the SketchUp subscription - not nothing, but a lot less than IESVE or DesignBuilder. I think that cove.tool is closer to IESVE's price. Simergy is supposed to be full-featured, like DesignBuilder or IESVE, but priced more like Sefaira. OpenStudio is free. There are many options at different price levels, and a great option that is already fully integrated with Vectorworks. Other than Revit, Vectorworks is in as good a position as any other BIM authoring tool with respect to energy analysis tools. A good path forward for more options is to go to the Rhino forums and join the very small chorus of people asking for Rhino.Inside integration with Vectorworks, and let Grasshopper replace Marionette. Then all of the tools of the Rhino/Grasshopper ecosystem, including the energy analysis tools Ladybug/Honeybee/ClimateStudio, would be available for analysis. Ladybug/Honeybee are free, and ClimateStudio is closer in price to Sefaira than to IESVE.
  6. Does anyone know why Vectorworks doesn’t heavily promote that IESVE has Vectorworks integration (for a long time now)? That is pretty huge. The only other BIM package with which it is integrated is Revit. It is arguably the most comprehensive energy analysis program available, at least in the EU and North America. Why not just play that up and develop lots of training for that workflow?
  7. I have a large, floating 3D OpenGL view on a second monitor, and my primary display divided into a large Top/Plan view and two small Front and Left views in wireframe mode. Sometimes on the second monitor I'll create an additional floating pane in Top view with a Marionette script open. All tools and properties palettes are stacked next to each other on the far left side of the main display. Another thing that has made a big difference: about 6 months ago I upgraded to two 4K, 32” monitors. Very nice to work on for long stretches of time. You can find them pretty cheap now. When I'm just on my laptop, I rarely use multiple views.
  8. Really cool! Beautiful! How much of a learning curve is there with C4D? Any stability or slowness issues? Have you tried Lumion? The live sync plugin works really well. It is also very good for entourage. I recently completed a corporate campus render model in Lumion 10.3 that covered an area of about 3/5 of a square mile and had about 6 buildings, roads, parking lots, sidewalks, a bridge, a hundred or so cars, 3D grass, a couple dozen people, and about 60,000 trees (the campus included densely wooded foothills), and it was stable and fast with an RTX 2080 GPU and produced beautiful renderings. All entourage was done in Lumion. FHD renderings took about 1-5 minutes each. 4K renderings took 2-12 minutes each, but typically 2-5 minutes. 8K renderings took 10-40 minutes each.
  9. Presumably, if LEED is listed in the Results List of Energos as an optional analysis output, and choosing this option has Energos produce a LEED-specific analysis, which entails a different calculation method (ASHRAE 90.1 Appendix G for LEED versus PHPP for Passivhaus), then the calculation results should be different. In reality, if one switches the Results output from Passivhaus to LEED, the calculation results do not appear to change at all as far as I could determine in running sample analyses. But the results should be different. NYSERDA did a comparative analysis and found that results varied significantly between ASHRAE 90.1 and PHPP modeling and simulation approaches because they use different calculation methods. Here is a second comparative analysis that also found the two calculation methods resulted in very different results. Given this, what is the value of selecting a LEED Results report? How are the results indicative of what is required to meet LEED performance requirements? If these questions cannot be answered succinctly, then is this an unfinished tool that slipped off the radar of developers? Perhaps when the tool was launched in 2016 the intent was to build out functionality for LEED, BREEAM, and the other referenced standards in the future, so they left them as options in the Results list, but then the development never happened. I think the tools either need to be finished properly or removed from the list, OR the documentation has to come with a sufficiently detailed explanation of the usefulness and limitations of relating Passivhaus results to ASHRAE 90.1-referenced standards like LEED. I do not believe it is sufficient to say that, "...Energos really is a designed energy evaluation tool that taps into the architectural design process. It's not intended for energy certification and there are other software, often mandated, doing that job. 
Benefit of something like Energos is that you know where you stand before that certification process because you can check the building performance and how design changes affect overall results." Nor is it sufficient to say, "...the results may be different, although the starting point is exactly the same. let's not get crazy about the official energy standards." Someone who selects an estimated set of benchmark energy performance results with regard to LEED or BREEAM or any specific standard has a reasonable expectation that the results provided are actually based on the calculation methods required by that standard. The fact that Energos is not a complete compliance tool and that it is meant for early stage design decisions is irrelevant. If the tool gives me a "LEED Result" indicating that the design has good energy performance or bad energy performance and it turns out that that analysis is actually a Passivhaus analysis presented to me as a LEED analysis, and if I then assure the client that our preliminary LEED analysis indicates that our energy performance will be in good shape, or conversely that it is in bad shape and that we need to spend more money on additional or enhanced assemblies, when in reality I have unknowingly completed a Passivhaus analysis and the results are not at all indicative of ASHRAE 90.1 compliance, then it could lead to missed targets, missed certifications, damaged relationships, and litigation. To be more specific, it appears from the NYSERDA assessment that Passivhaus is more detailed in assessing the envelope, unsurprisingly, but less detailed in assessing the HVAC systems and automatic controls, also unsurprisingly. 
So if I have a more or less cube-shaped, mid-sized office or educational building in a temperate climate that is internal-energy-load-dominant and Passivhaus is not fully accounting for all of those internal energy loads and associated controls, then it is possible that the Passivhaus analysis will indicate that we need to increase the performance of the envelope systems to improve the performance of the building overall when in fact this is not the case. This could result in me spending time looking at higher performing envelope options and convincing the client to spend more on the envelope when in reality this is not the best use of resources because the loads and efficiencies are being driven more by the HVAC systems and automatic controls. Furthermore, what is the client going to say when, for building code energy performance compliance, which uses the same calculation methods as LEED, the energy model is simulated and shows that the modeled performance is wildly different than the initial 'LEED Results' analysis? The client could reasonably say, if the energy code requirements are based on the same calculation methods as LEED, then why is the initial energy analysis model so different than the compliance model when both models are modeling the same wall types, shape, orientation, roof types, floor types, percent/type of glazing and openings, etc.? Again, I think that Vectorworks needs to make a fix here. Either remove the LEED/BREEAM options, complete them, or provide documentation to explain their specific and limited use cases.
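The divergence described above can be made concrete with a toy sketch. The following Python is not actual PHPP or ASHRAE 90.1 Appendix G math - every coefficient and input here is invented for illustration - but it shows how two simplified methods fed identical building inputs still disagree, because they weight internal-gain utilization and plant efficiency differently:

```python
# Toy illustration (invented coefficients, NOT real PHPP or ASHRAE 90.1
# Appendix G calculations): two simplified annual-heating-demand methods
# applied to the *same* building inputs give different answers.

def demand_envelope_weighted(ua, internal_gains_w, degree_hours_kh):
    """Envelope-focused method: transmission losses minus internal
    gains at an assumed 70% utilization factor."""
    losses_kwh = ua * degree_hours_kh / 1000.0
    gains_kwh = internal_gains_w * 8760 / 1000.0 * 0.7  # 70% useful (assumed)
    return max(losses_kwh - gains_kwh, 0.0)

def demand_systems_weighted(ua, internal_gains_w, degree_hours_kh, hvac_eff=0.8):
    """Systems-focused method: identical envelope losses, but gains are
    utilized at 90% and a plant efficiency is applied on top."""
    losses_kwh = ua * degree_hours_kh / 1000.0
    gains_kwh = internal_gains_w * 8760 / 1000.0 * 0.9  # 90% useful (assumed)
    return max(losses_kwh - gains_kwh, 0.0) / hvac_eff

# One set of building inputs, two "results reports":
inputs = dict(ua=450.0, internal_gains_w=2000.0, degree_hours_kh=79000.0)
a = demand_envelope_weighted(**inputs)
b = demand_systems_weighted(**inputs)
print(f"method A: {a:.0f} kWh/yr, method B: {b:.0f} kWh/yr")
```

The point is not the numbers; it is that the label on a results report implies a calculation method, and swapping the method changes the answer even when the model inputs are untouched.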
  10. Out of curiosity, have you looked at Maxwell Studio or Lumion?
  11. @Luka Stefanovic the Energos Results list specifically includes options for viewing results for LEED, BREEAM, and other standards in addition to Passivhaus. What is the value of a LEED or BREEAM results analysis? I think that this is where some of the confusion comes from.
  12. The only issue we’ve had with lumion is that it is not a light simulator, so depending on light, atmosphere, and settings, materials will look great, maybe even almost photorealistic, but may not be a completely accurate representation of what the materials will look like in reality. So it is important to manage client expectations if you do not think that the materials are rendering accurately (even if they look good).
  13. Yes, that is why I say it is like buying an Apple product or a Wacom product. The degree of refinement, stability, integration, and quality is exceptional... but you pay for it.
  14. Has Vectorworks looked at doing an implementation of AMD ProRender? It looks promising and is GPU-agnostic.
  15. Corona Renderer is for 3ds Max and Cinema 4D only. Redshift is coming to Vectorworks as an integrated service (see this article). Enscape is coming to Vectorworks as an integrated service, and there is already a public beta that you can download and use. Twinmotion is coming to Vectorworks as a live sync plugin, but it may already be used if Vectorworks models are exported to it. Artlantis already supports Vectorworks with an integrated plugin. Maxwell Studio can be used with Vectorworks, though there is no integrated plugin. Not sure about Octane or V-Ray. Lumion is outstanding for what it is (fast, and it can handle large models). Buying/using Lumion is a bit like buying/using an Apple or Wacom product: it is very refined and well-designed for what it does. Regarding Renderworks, check out Luis M. Ruiz's work; also check out: https://university.vectorworks.net/mod/scorm/player.php?a=22&currentorg=articulate_rise&scoid=44 https://university.vectorworks.net/mod/scorm/player.php?a=77&currentorg=articulate_rise&scoid=154
  16. FYI, this is a good article about new features coming to Vectorworks 2021. https://www.world-architects.com/en/architecture-news/insight/the-bimness-of-vectorworks-2021
  17. The way Vectorworks affords sheet layers, design layers, viewports, worksheets, and classes and uses symbols is somewhat (and surprisingly) analogous to how Bentley's OpenBuildings (i.e., Microstation) arranges types of information into design models, drawing models, sheet models, databases, and cells and somewhat analogous to how Revit arranges sheet views, model views, detail views, its spreadsheets, and families. So it is possible/useful to look at workflows and organization for large, complex projects as implemented in the other BIM authoring tools for guidance setting up Vectorworks for large projects. Also, setting up a project in a federated approach with a shell model, a site model and/or a campus model, and then interior models is a standard workflow for large, complex models no matter the BIM authoring tool (more on this below). I say all of this to say that it is worth looking at how Bentley in particular organizes models, drawings, and sheets for large projects for two reasons: (1) Bentley specializes in design and documentation of large, complex models and has optimized their workflows and tools to this purpose; and (2) given that Vectorworks data structure, features, and functions afford similar workflows as used in OpenBuildings, it is possible to adapt Bentley's methods to using Vectorworks for large projects. Also, FYI, I have worked on large, complex projects most of my career. I have used OpenBuildings, Revit, and a little Vectorworks. For large, complex projects (e.g., multiple buildings, multiple floors, hundreds of thousands to millions of square feet under roof, hundreds of rooms, hundreds of doors, hundreds of walls), it is not possible to efficiently and effectively model a building in a single unified model in any BIM authoring tool. When we transitioned to Revit, we eventually had to use it as we had been using OpenBuildings in order to work quickly and effectively in the models. 
Neither Revit Server nor Autodesk's cloud CDE SaaS could handle very large, unified models, especially with many people distributed between the U.S. and Europe working on them simultaneously, in a way that made it qualitatively different from or superior to Bentley or Vectorworks. In the end, we still needed a federated approach on our own servers with Revit, because having large, unified models was slow, and having 3+ people all making changes in each model simultaneously caused issues with not all changes being saved to the central model and with the models getting corrupted (not a knock on Revit or Revit Server or Autodesk's SaaS CDEs; we were using the tools in ways that pushed their limits). I say this to say that I do not see Vectorworks as deficient in any way for work on large projects. Sure, it may not be as strong as another BIM authoring tool in some ways, but then in other ways it's stronger. There are just trade-offs between the tools. Anyway, it is worth looking at Bentley's training for how to set up federated models.
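As a rough illustration of the federated setup described above (the file names and structure here are invented, not any particular tool's project format), the master file holds only references; each discipline edits its own smaller file, so no single file has to absorb every concurrent editor:

```python
# Illustrative sketch of a federated project structure (invented names,
# not a real BIM tool's API): a lightweight master that only *references*
# shell, site, and interior models instead of one monolithic model.

from dataclasses import dataclass, field

@dataclass
class ModelRef:
    name: str
    path: str   # each sub-model lives in its own file
    team: str   # and is edited by its own, smaller team

@dataclass
class FederatedProject:
    master: str
    refs: list = field(default_factory=list)

    def add(self, name: str, path: str, team: str) -> None:
        self.refs.append(ModelRef(name, path, team))

    def editors_per_file(self, editors_by_team: dict) -> dict:
        # The point of federation: concurrent editors are spread across
        # many small files instead of piling into one central model.
        return {r.path: editors_by_team.get(r.team, 0) for r in self.refs}

campus = FederatedProject(master="campus_master.vwx")
campus.add("site", "models/site.vwx", "civil")
campus.add("shell_bldg_A", "models/bldg_a_shell.vwx", "core_shell")
campus.add("interiors_bldg_A", "models/bldg_a_int.vwx", "interiors")

load = campus.editors_per_file({"civil": 2, "core_shell": 3, "interiors": 4})
print(load)
```

The master stays small because it carries pointers, not geometry; adding a building means adding references, not growing one file that every team must open.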
  18. I understand Energos' role relative to Passivhaus. The report function also lists report options for LEED and BREEAM. What is the value of these LEED and BREEAM reports? For instance, LEED's energy performance requirements are based on the ASHRAE 90.1-defined baseline model and calculation methods. Is this part of the Energos analysis?
  19. @Andy Broomell I agree that a gumball or axis-locking is useful --- in addition to smart edges --- not in lieu of them. It is not an either/or issue; it is a both/and issue. For instance, both Rhino and SketchUp have good inferencing tools --- arguably among the best in the industry. And yet, when a large model with many assets is manipulated, especially if some assets are tightly clustered or are very large along an axis, it can be difficult to use inferencing, because as the cursor is moved, it passes over so many points or edges that either (a) the system does not align to the desired points/edges or (b) the linework is so dense that the user has a difficult time discerning which points or edges must be used as references. This is an instance where a gumball or axis locking is useful, even if a great inferencing system is already in place. When a user is manipulating a single object/edge/point/polygon in the context of a dense and large set of objects and inferencing is struggling, being able to click on the gumball axis and/or use axis locking to move the object quickly and easily in a known direction is useful. In addition, it is useful to be able to reposition the gumball mid-command. For example, gumballs typically show at object centers. For long objects, like a facade canopy, this can be problematic.
If the canopy, for instance, must be moved some distance vertically - let's say 7" - to match the sight lines of an existing adjacent canopy, and the gumball is in the middle of a 100-foot-long canopy, then the user must zoom in to the gumball, start a reference line by clicking a start point, then zoom out, draw a reference line from the edge of the canopy horizontally over to the existing canopy, zoom in to verify where the reference line is ending, then measure the vertical distance between the horizontal reference line and the existing canopy, then delete the reference line, then zoom out, then select the new canopy, then pick the gumball and enter the new value, then zoom in to where the new canopy is adjacent to the existing canopy in order to verify that the command executed properly. This is a lot of zooming in and out and clicking and measuring. Conversely, if the gumball is repositionable, as in Rhino, then the user selects the canopy, hovers the cursor over the gumball, holds down Control/Command while selecting the gumball, and can then reposition the gumball to a point on the new canopy adjacent to the existing canopy. Then the user can select the vertical axis of the gumball and inference to a point on the existing canopy, and the canopy shifts vertically to the correct sight-line elevation without creating reference lines, without zooming in/out/in/out/in/out, without having to mess with which reference point the inferencing system is grabbing, and without knowing the vertical distance the object must be moved. It takes a 30-second task and makes it a 5-second task, and over the course of a day, time savings like these really add up. By the way, I can imagine a response along the lines of, 'with better asset/class/group/object management, something like a gumball or axis-locking is not needed in this example.' This is true.
However, my counterpoint to this sentiment is that we often must work with models coming from somewhere/someone else, and we do not have control over or full knowledge of how they modeled or organized their models. It may not be that the other modeler is messy. Rather, it may be that someone exported a model from Revit or another program, and the export utility that converted the geometry and data organized the export file in a way that makes it difficult or messy to manipulate in Vectorworks. It is typically not cost-effective or necessary to rebuild others' referenced models so that they are clean and efficient in our own models. Given this, a gumball also makes it easier to address the problem described above when we do not have the luxury of tight control over the content and organization of our models.
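For what it's worth, the geometry behind axis locking with a repositionable pivot is simple. This Python sketch uses illustrative names only - it is not the Rhino or Vectorworks API - and shows an object snapped to a picked reference point along a single locked axis, measured from a user-placed pivot:

```python
# Sketch of the geometry behind axis-locking with a movable gumball pivot
# (illustrative names, NOT a real modeling API): moving along a locked
# axis is just projecting the pivot-to-target displacement onto that axis.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def __sub__(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
    def __add__(self, o): return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
    def dot(self, o): return self.x * o.x + self.y * o.y + self.z * o.z
    def scale(self, s): return Vec3(self.x * s, self.y * s, self.z * s)

def axis_locked_move(position: Vec3, pivot: Vec3, target: Vec3, axis: Vec3) -> Vec3:
    """Translate `position` along unit `axis` so the pivot lines up with
    `target` in that axis only (e.g. snap a canopy edge to a sight line)."""
    delta = target - pivot              # full 3D displacement to the picked point
    dist = delta.dot(axis)              # component along the locked axis only
    return position + axis.scale(dist)  # translate the object by that amount

# Canopy whose pivot has been repositioned to its edge, snapped vertically
# (Z-locked) to a point picked on an existing adjacent canopy:
canopy_origin = Vec3(0.0, 0.0, 10.0)
pivot = Vec3(100.0, 0.0, 10.0)            # pivot moved to the canopy edge
existing_edge = Vec3(100.5, 2.0, 10.583)  # point picked on the existing canopy
z_axis = Vec3(0.0, 0.0, 1.0)

moved = axis_locked_move(canopy_origin, pivot, existing_edge, z_axis)
print(moved)  # only Z changes; the picked point's X/Y offsets are ignored
```

In the canopy example, only the Z component of the displacement (0.583 ft, about 7") is applied: one pivot placement and one pick, with no reference lines and no measured offset required.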
  20. @DBrown Since my Service Select subscription is coming due, and this conversation has given me pause, I contacted support and asked if Vectorworks is fully committed to continuing to develop its architectural BIM tool. I received a firm 'yes' as a reply, and was told that upcoming releases have BIM-focused improvements. As a separate but related matter, I just recalled that Vectorworks has continued to be the first, or one of the first, to certify its BIM authoring tool for each successive iteration of the IFC standard, and it supports BCF. It strikes me that in that article, unlike other similar articles that I've read, the author does not quote Dr. Sarkar. Rather, in this article, the author is providing his summary of the conversation. It would be useful to know what Dr. Sarkar said. When he's given interviews in the past, he's presented a clear vision for Vectorworks' BIM capabilities and direction. In the recently recorded virtual keynote presentations, there is a heavy emphasis on BIM. Dr. Sarkar addresses several BIM topics, and Rubina Siddiqui's presentation focuses on BIM and is worth watching. In addition to the topics that she covers in depth, if you go to the 4:54 mark, there is a slide with about 15 additional BIM/architecture-specific enhancements that they are developing. Great presentation. Strong BIM focus. I tend to think that Vectorworks is solidly committed to developing its BIM authoring tool. As per my comments above, I think that several of their initiatives demonstrate that they're positioning themselves for the future of BIM (4D-7D BIM) and beyond.
  21. @DBrown Thanks for the reference. I had not seen that article. Has Vectorworks indicated that they are refocusing away from offering a full-featured architectural BIM authoring tool?
  22. @VE4 No offense taken. I guess I feel compelled to communicate that there is an existing market for these aspects of BIM (and Vectorworks) and it is part of what makes me optimistic about its future. Sorry if it came off as taking offense. I suppose from my perspective, to use a sports analogy, I see Vectorworks like an overlooked draft prospect --- like Julian Edelman or Malcolm Butler --- who has a lot more potential and capability than it is often given credit for having. I think it is a remarkable platform. It is usually talked about with respect to small to medium-sized projects, though it is perfectly suited for large projects. It is discussed with regard to standard contemporary practice, though the capabilities it develops for the landscape, planning, and production/event design markets contain the essence of what it needs to engage an IoT world. I also feel compelled to communicate a warning that is in concert with some of the apprehensions expressed in this thread, albeit from a different perspective. While Vectorworks may be well-positioned to engage a data-driven, performance-based design future, it also can't wait to engage it or it will find itself woefully behind. I think that they're taking the steps they need to take. Time will tell. Good evening to you, too.
  23. @VE4 For the last six years, I've worked on industrial/biotech/pharma/government/entertainment/large corporate projects with budgets ranging from $1 million to $2 billion total installed cost, in CA, NC, NJ, PA, GA, IL, WV, and FL (Vectorworks gets used on big projects like these, too). The industry dynamics I described are already here for these project types when there is an aggressive schedule for a high-capital-cost, mission-critical facility. Contractors are already incorporating drones, exoskeletons, ubiquitous RFID tagging, remote inspections, real-time analytics, wearable sensors, extensive use of 3D scanning, eye tracking, VR, 5D or 6D BIM, and limited digital-twin technologies on projects of even just a few million in construction cost (and have for at least a few years). When the construction schedule goes vertical (very tight schedules, construction starts before design is done, with many trades working side-by-side and one after the other on-site), such IoT technologies become useful risk and cost management tools. What I see more recently is these dynamics making their way down to smaller commercial projects. It makes sense. Contractors are driving adoption, and contractors who do big work will also do small work to break into new markets or when markets are tight. GCs using these tools drive subs to use them, too, and on the large capital projects that I described above, subs already do use these tools. They also take on smaller work when needed. So these methods and tools make their way down to smaller, more common light commercial, K-12, and residential projects. And now Autodesk is buying up companies that offer many of these services, or replicating them in-house, and starting to incorporate them into its ecosystem of tools, also lowering the barrier to entry for smaller projects. These trends are not just academic, and they're not 10+ years out.
They're here today in large capital cost and mission critical projects. Maybe they're 3-5 years out for light commercial --- I don't know --- I haven't done light commercial in over a decade. But from a BIM authoring tool development roadmap perspective, these dynamics are effectively here already.