
IanH
Member · Posts: 795

Everything posted by IanH

  1. Don't be tempted to make plant lists too large, otherwise they will have a performance impact on your workflow and on VW.
  2. CSS is further evidence of keeping content abstract from the structure (in this case presentation) of the document. CSS controls attributes of the structure of the document (which, in the case of XHTML, influences the presentation when viewed in a browser) rather than inline markup of the content. It is the lack of inline markup in VW that the OP's rant is all about, and the subsequent comments about the VW way being 'backward' or Mac-orientated have absolutely no merit.
  3. I don't think it's an unreasonable amount of time. To know about compatibility you first need to test. When I worked in software development, our test/release cycle was about 10 weeks: 4 weeks system/regression test, 4 weeks partner system tests run in parallel with 4 weeks of user tests, and 2 weeks lockdown/last minute patches. Ours was a much larger system than VW, but we also probably had more people working on it. So let's assume, without knowing the NNA setup, that their release cycle is the same duration. The other unknown is whether the 2009 testing could run in parallel with the 2010 testing - it would not be unreasonable if it could not, and so had to run sequentially so as not to impact the schedule of the 2010 release. Bearing in mind that any bug found has the potential to invalidate testing done so far and cause the test cycle to be restarted, it is not unreasonable to wait until a stable/final 10.6 version is available, otherwise you would simply end up going through the test cycle multiple times. It is a waste of time and resources to work with moving goalposts. To be honest though, for a bug fix to an OS change, you could probably mitigate the impact of the fix so that the risk to the release deadline was low, but that does not change the fact that to find bugs with 10.6 the system has to be tested, fixed and retested, and that a change in 10.6 would impact and possibly invalidate significant amounts of testing done so far. The time frame offered by NNA is, IMHO, certainly not unreasonable for a complex commercial product.
  4. I see that you still don't know what you are talking about. Which part of "[ML] stands for the markup of the language rather than the content" don't you understand? I don't have to read Dr. Lee's website. I was an XML SME for the world's largest bank about 8 years ago and specialised in the design, implementation and testing of high performance, real time, 64 bit, multi-threaded, XML based (along with many other technologies) financial systems. I think I know what I am talking about, and I understand the subtleties and differences between what is a markup language and what is markup content, which you clearly cannot grasp.
  5. As I say, people like yourself have become confused. Yes, the ML does stand for markup language, but it stands for the markup of the language rather than the content. Most electronic documents contain markup in some form, yet you don't see Word, for example, being called WordML because it has markup. As in many cases, you have obviously picked up on one particular aspect without understanding the detail. My example of the difference between HTML and XHTML, and you not seeing the differences yet latching on to some completely unrelated aspect, is evidence that you do not understand what you are going on about.

     Yes, HTML and XHTML are based on markup, but that is where the similarity ends. In HTML, the markup describes the content, ie the text, which is exactly the type of markup that you are talking about. However, in XHTML, the markup does not describe the text; it describes the structure of the document. This is a subtle yet very fundamental and important difference, and it allows the content to be abstract from the implementation. With HTML, a piece of markup is a trigger to the content; there needs to be no structure to it. With XHTML, the 'markup' is a property of the structure of the document, or more precisely, the lower levels of the document. It thus becomes an attribute of that section of the document, and that section alone. Or, in an object orientated world, it becomes an attribute (not, to avoid confusion, the same as an attribute in the XML language) of that instance of the document. Thus: in HTML, the markup controls the text; in XHTML, the text is controlled by the structure of the document rather than by any markup.

     Taking it one step further, to reiterate your tenuous understanding of the subject: in XHTML, you may describe the XHTML document by means of a text representation, whereby the structure is indeed defined by the markup in the text. However, you can then store the document, for instance, in a tree structure (whereby objects have properties/attributes and may have child objects) which keeps the structure intact, yet all the markup has gone. The text is still perfectly correct and the web page can be rebuilt solely from the structure of the document and not from any markup, yet with HTML the text would still require the markup, as it is part of the text rather than the document structure. Thus, with XHTML, the text and the markup are abstract; it is the properties/attributes of the document structure that control the representation of the page. Whereas with HTML it is not; it is the markup that controls the page layout. To the uninitiated, when viewing the source of a web page, the differences are very subtle if not totally transparent, but when looked at in more detail it is actually very fundamentally different.
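     As a minimal illustration of the distinction described above (the class name is a hypothetical example), compare where the presentation lives in each case:

         HTML - the markup is embedded in, and controls, the text itself:

             The <font color="red"><b>word</b></font> is important

         XHTML + CSS - the text is plain; the presentation is an attribute of a node in the document structure, held in the stylesheet:

             The <span class="important">word</span> is important

             .important { color: red; font-weight: bold; }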
  6. Actually, I used the text menu, which can of course be triggered by keystrokes - alt-xs9 etc.
  7. I don't see anything special in 10.6 that is suddenly going to make VW multi-threaded. Any serious application is going to need to be designed to take advantage of multi-threading, rather than have some OS bolt-on magically enable it. Writing multi-threaded apps is non-trivial, and if it were easy or economically viable, it would already have been done, IMHO. There may be some speed benefits from having parts of the OS multi-threaded, but these will be insignificant compared with designing an application and its algorithms to be multi-threaded in the first place. That said, I believe bits of the Parasolid kernel may be multi-threaded, and 2010 may make better use of this.
  8. You seem to be confusing program look and feel with the underlying implementation. For at least the past 40 years it has been recognised that the program at the user level should be abstract from the underlying technologies, at many levels. Just because something is input in a particular way, it doesn't mean that it should be stored or processed in a particular way. A markup based approach and an attribute/properties based approach (not to be confused with a true object orientated approach such as provided by object orientated languages or operating systems) can quite happily be interchanged and should have no bearing on what the underlying operating system or programming language offer. As technology has evolved, this abstraction has become easier and, unfortunately, in some cases has become confused and abused. Take HTML and XHTML as two widely understood text languages. They are quite different, one being markup based and the other being based on an attribute based language, yet both are very similar and both are equally at home being processed in a markup based approach or an object orientated/attribute way. But the end result is identical as far as the user (web user) is concerned, which is hardly surprising, as it is abstract from the underlying technology even though the underlying technology is identical - text, albeit interpreted in quite different, yet highly similar, ways.
  9. I have no idea what "RAM overshadowing my CPU" means. I don't know how technical your friend is, but it sounds like a bit of misunderstood guesswork to me. I would go back to basics: check the page file settings (revert to recommended settings), check that the BIOS is reporting the RAM correctly and, if possible, run a RAM test to make sure all the RAM is working correctly and the timing settings etc are correct. Also double check that the RAM is compatible with your machine and with any other RAM that may already be present - it looks like you may have swapped old out for new, which is good. Then check the compatibility of suspect devices and drivers, especially the NVidia graphics card, as this is a memory mapped device that will shadow a significant part of your RAM. Google is your friend: search "System/SYSTEM high cpu" and see if anyone's posts look familiar.
  10. Vegetation line

      Are you sure you are not using the plant line tool instead of the vegetation line tool with the cloud option? Also, the min and max radius settings are important, especially if you are running metric, because the radii were initially in inches by default; if you are working in, say, mm, the values will be approx 25x too small. Also consider using the revision tool (then ungroup and split as needed) or the new planting area tool for quick and dirty cloud style lines. So it will be worth revisiting the options and the min and max radius. That said, I have just noticed that my veg lines are appearing inverted, ie concave within the objects rather than convex outside the objects, so there may be a bug.
  11. The upgrade price of Vectorworks depends on what version you are upgrading from. Also, as you are going from Fundamentals to Architect, a tradeup is involved. The prices below are current RRP ex-VAT prices for a product tradeup from Fundamentals to Architect 2009, which will hopefully give you some idea as to whether it is worth upgrading from 12 to 2009 or waiting for 2010. All prices are for a 2nd licence; a 1st licence will be more.

      £1103 Architect 2009
      £816 Fundamentals pre-10 tradeup to Architect 2009
      £713 Fundamentals 10/11 tradeup to Architect 2009
      £666 Fundamentals 12 tradeup to Architect 2009
      £524 Fundamentals 2008 tradeup to Architect 2009
      £264 Architect 2008 upgrade to Architect 2009

      There may very well be a price change and a different upgrade matrix when 2010 comes out but, if all things stand equal, it's going to cost £666 to upgrade to 2009 now, with a potential further upgrade of £264 (total £930) if you decide to go to 2010 at a later date, or, theoretically, £713 if you wait until 2010 comes out. So it all comes down to whether you want 2009 now (£666) or will wait for 2010 (£713). Your reseller may well have different pricing, so they can give you a firm price; the above are simply indicative RRP prices.
  12. What bits are you looking for, and for what discipline? I have one of Archoncad's 2008 Landscape books that I would be happy to sell. I skimmed through it but did not find it relevant to me, so it is in mint condition - it may not be what you are after though. There are also online training resources on the NNA website.
  13. I agree with all of the above. 10 years ago I bought an SGI 540, fully loaded with quad processors, and I regretted that decision 3 months after purchase and have done ever since. The difference between a kick-ass machine, in Alienware terms, and a well sorted off the shelf PC is, in real world terms, going to be much smaller than you think. For instance, the difference in speed between a 3.3GHz processor and a 3GHz processor is 10%. Renderworks will therefore, in a perfect world, only get a speed boost of 10% (the graphics card will have a negligible effect on Renderworks speed), so your 14 hour part finished render will still be a roughly 13 hour part finished render. What you gain in processor speed, you may lose by choosing a 64 bit OS and running a 32 bit app on it. Before long, your machine will be bettered by something a fraction of the price, and your model will be twice the size and take twice the time to render, so you will be worse off speed wise. Your $5299 will then feel like a waste of money, whilst some minor changes to your current hardware, software and working practices may be more cost effective, albeit without a shiny new box with blue lights, making a hell of a racket, generating a lot of heat and using copious amounts of electricity. IMHO.
  14. I'm just guessing, but could it have anything to do with plan rotation functionality or DLVPs not being available prior to 2008?
  15. I would imagine that more use will be made of the Parasolid engine. It's presumably a significant investment, and it was stated at the launch of 2009 that only about 60% of the VW kernel had been swapped out to use the Parasolid engine, so I expect that to grow, along with more advantage being taken of the Parasolid multi-threading capabilities to give VW a speed boost on multi core systems. There is also a component of Parasolid that I understand has been licensed but of which we have yet to see anything, although some work has obviously been done in this area, as 2D constraints seemed to improve significantly in 2009. I'm only guessing here, but it may be the 3D Dimensional Constraint Manager, which allows dimensions to be bounded by upper and lower constraints for distance and angle. I believe an example of this type of technology (although I don't know if Parasolid was used) was the design of the Swiss Re Centre in London, where the CAD model of the thin shell structure can be deformed within the dimensional constraints of the model. I would also expect a few bugs to have been resolved and a few more introduced!
  16. Vectorworks offers far more powerful modelling modes than Sketchup. Sketchup can get away with doing things (planar surfaces) quickly and easily because it only has easy things to deal with, so it does its one job very well. Vectorworks, on the other hand, has to deal with far more complex ways of doing things (any mix of planar surfaces, solid modelling, NURBS surfaces and 3D parametrics), which means that Vectorworks cannot easily benefit from some of the tricks that packages offering less powerful ways of doing things can get away with, and which, in the early days of working with Vectorworks, may come across to some as it being a less powerful option. Whether the user needs the power of Vectorworks is simply down to the individual user's requirements, and if they do, they are going to have to accept that it's going to take a little longer to grasp.
  17. It could be any number of things. You don't say how large your model is or what you were doing at the time. It would also be useful to have some diagnostics from the performance monitor - see below. Make sure that you have the latest service packs; IIRC there was a performance issue that was fixed in one of the SPs.

      A very quick guide to the Windows performance monitor... Vectorworks can be very intensive on resources, so running out of memory or high CPU usage can be perfectly normal. Large models, rendering, high resolution graphics, textures, images etc are all resource intensive. The performance monitor is available from the Task Manager under the performance tab. If you are not running with admin rights, there may be other processes that you cannot see directly, but the performance tab will give you an overall feel for what is going on and what is running low on resources.

      If you are running 2 or more CPUs/cores, and both CPUs are near 100% and you are not rendering, then it is likely that VW is not at fault - VW only uses 1 CPU, so if both are high, something else is also causing excessive CPU usage. You can see what the culprits are by looking at the processes tab and sorting on CPU usage. If running 1 CPU, sort on CPU to see what processes are eating up the cycles. If Vectorworks is one of them, and it is not at or very close to 100% CPU all of the time, and you are not rendering, then a faster processor may help. If Vectorworks is at or very close to 100% most of the time, then it is possibly an application issue; close it and restart.

      If page file usage is very active, then you are probably running a little low on RAM, as physical RAM is paged back and forth to the page file. Close down any unused applications and memory hogs - you can see these by viewing the processes tab and sorting on memory usage. RAM is cheap, so if it is available and easy to fit, then maybe fit another GB. Revisit the page file settings after adding more RAM.

      If the values in the commit charge section are very similar, then you are likely to be running close to the limit of virtual memory. Either let Windows control the page file size itself, or increase the page file size yourself - a general rule of thumb for page file size used to be 1.5 times the RAM size, which in these days of large amounts of RAM does not necessarily hold true any more, but it is a starting guide. This is controlled under the advanced/performance tab in the system properties - right click My Computer, then Properties. A reboot is likely. Rogue applications can be killed, either by terminating the process or, for stubborn applications, the whole process tree. A scripted way of capturing the same numbers is sketched below.

      Hope this helps.
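      As a scripted alternative to eyeballing Task Manager, here is a minimal Python sketch of the same diagnostics (an assumption on my part, not from the original post; it uses the third-party psutil package, installed with "pip install psutil"). It samples per-process CPU and memory for a few seconds and prints the top consumers:

          import time
          import psutil  # third-party package, assumed installed

          procs = list(psutil.process_iter(attrs=['name']))
          for p in procs:
              try:
                  p.cpu_percent(None)  # prime the per-process CPU counters
              except psutil.Error:
                  pass

          time.sleep(5)  # sample window

          rows = []
          for p in procs:
              try:
                  rows.append((p.cpu_percent(None), p.memory_info().rss, p.info['name'] or '?'))
              except psutil.Error:
                  pass  # process may have exited, or access was denied

          # top 10 by CPU usage over the sample window, with resident memory
          for cpu, rss, name in sorted(rows, key=lambda r: r[0], reverse=True)[:10]:
              print("%5.1f%% CPU  %8.1f MB  %s" % (cpu, rss / 2**20, name))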
  18. Unless you are rendering in OpenGL, the speed of the video card has little influence on the speed of rendering. Other render modes are largely CPU bound.
  19. Unless your machine is short of memory, in which case a lot of virtual memory (disc) paging activity will occur, hard disc speed will not have too much of an impact on the speed of an application once it is up and running - unless the application by its nature loads a lot of information continuously from disc, which does not happen when rendering, as everything is in memory. The benefit of a 5400rpm disc over a 7200rpm one is likely to be cooler running and longer battery life; the disadvantage is slightly longer OS and application load times. But check the power consumption of each disc option, as specifications can vary between models, so this is only a rule of thumb. For a laptop that will be run on battery, I would likely go 5400, but for a desktop, I would go 7200 or faster.
  20. Pat's way is traditionally one of the ways of doing this; however, it does not necessarily take negative numbers into account if you need to handle them. If you do need to handle negative numbers, then it is slightly more complex. Pascal often has a sgn function that returns -1 or +1 depending on the sign of the number. VS does not have this, so if you need to handle negative numbers, you can create a function to mimic sgn and use the formula below. However, your application may require a rounding of negative numbers that cannot be accomplished with the round function in a simple formula, in which case you will need to revert to an if statement for the whole formula that implements it slightly differently depending on whether you want a negative number rounded up or down.

      i := ROUND(ABS(criteria) + 0.5) * sgn(criteria);  { rounds away from zero }

      FUNCTION sgn(n : REAL) : LONGINT;
      BEGIN
          IF n < 0 THEN sgn := -1
          ELSE sgn := 1;
      END;

      You also need to test that 0.5 added to 1 does not round up to 2. Some Pascal implementations might do this, and it may even be processor specific, resulting in rounding errors that mean the value you see on screen (ie 0.5) is really a slightly different number (ie 0.50000000001) internally. In this case, the 0.5 may need to be 0.49..., depending on the potential precision of the input value and the floating point representation in use.
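      For comparison, here is a minimal Python sketch of the same round-away-from-zero idea (illustrative only - not VectorScript, and the function name is hypothetical). Taking the ceiling of the magnitude sidesteps the 0.5 edge case mentioned above:

          import math

          def round_away_from_zero(x: float) -> int:
              # sign of x: -1 for negative, +1 otherwise (mimics sgn above)
              sign = -1 if x < 0 else 1
              # ceil of the magnitude avoids adding 0.5, so no 1.5-rounds-to-2 ambiguity
              return sign * math.ceil(abs(x))

          print(round_away_from_zero(2.3))    # 3
          print(round_away_from_zero(-2.3))   # -3
          print(round_away_from_zero(1.0))    # 1 (an exact 1 stays 1)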
  21. Use the place plant tool and set the spread as required. This will create a 2D copy of the plant symbol and resize it for you.
  22. What are you printing/plotting on? There can be many places where output can be set to rotate, especially on a plotter that may allow you to auto rotate the plan to economically fit on roll paper.
  23. If it is the problem, then you could possibly write a script on the Mac that keeps the network drive 'alive'. At a very basic level, you could create a dummy file on the problem drive(s) and write a small background AppleScript (or whatever it is now called) on one/all of the Macs that either opens the file read-only and closes it, or directory lists it, then sleeps for a period of time (maybe 10 minutes) and starts over again - see the sketch below. This may be enough to keep the drive alive. It's been over 20 years since I last programmed a Mac (well before OSX days), but it's likely to be only a handful of lines of script. If this solves the problem, then you can either stick with the script or investigate the root cause, which may be something intentional, such as spinning a disc down after a period of inactivity to conserve power. The server discs on my NAS box spin down after 30 minutes of inactivity and probably take 10 seconds or so to come back online, by which time iTunes has concluded that the iTunes library is missing and thrown an error. For me this is a minor inconvenience compared with the power I save by not having the discs spinning 24/7, but in an office environment this may not be a consideration.
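      As a sketch of such a keep-alive script - written here in Python rather than AppleScript purely for illustration; the mount point and file name are hypothetical assumptions:

          import time

          KEEPALIVE = "/Volumes/OfficeServer/keepalive.txt"  # dummy file on the problem drive

          while True:
              try:
                  # open read-only and close, just to touch the share
                  with open(KEEPALIVE, "rb") as f:
                      f.read(1)
              except OSError:
                  pass  # share briefly unavailable; try again on the next cycle
              time.sleep(600)  # sleep for 10 minutes, then start over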
  24. Interesting that the files are on a server. The reason I initially asked about location is that I had a theory (and only a theory) regarding files on a remote device. I am wondering if, when you open the referencing file, you don't at that point have a connection to the referenced file. Vectorworks can't then find the file and brings up the lost reference. In the meantime, the OS reconnects to the office server, and all is then fine, as you can see the file. Whilst I have never seen this with Vectorworks (all my files are local), I get a similar thing happening with iTunes, where my music folder is stored on a remote disc. Sometimes when I open iTunes, it says that it cannot find the library and I need to relocate it. By the time I relocate it, the network drive is available again, and it all links back up OK. The reason iTunes cannot find the file is either that the network drive has briefly dropped off the network or that the drive has powered down in power saving mode and takes a few moments to spin back up to speed and come back online. I wonder if this is what is happening to you, but with Vectorworks.