Petri

Large DTMs


It is quite obvious that VW 9 cannot create a DTM of my current data set, with 8700 3D polygons and 164000 vertices, but how about VW 11?


This is going to be interesting - VW 9 could not model the data even as 10 separate sections.


Petri, your data is a highly detailed topo map. You should also carefully read my comments in an earlier discussion here


Petri, I've reviewed your file. You have a detailed topo plan consisting of 8,867 3D polygons with an average of about 18 vertices each: 8,867 x ~18.5 ≈ 164,000 points. Generally, the DTM is designed to use much less dense data and to create topo plans (which you already have, in effect) and site models. You call this "real world data", but what it really is, in DTM terms, is "real world output".

You're starting with a data set that was itself generated from a DTM built on a much sparser set of data, and the extra data points are (as you should know, being a DTM expert) extraneous. They do nothing but clog up the DTM engine.

To make calculation easier, you can either:

A. Get a set of the actual (sparse) data used to produce the topo and use that as input into the DTM engine;

or

B. Do an intelligent filtering of the 3D polys to significantly reduce point density.

We provide a command for (B) called (appropriately enough) "Filter 3D polys". Try it with a filter value of between one and two feet (35 - 70 cm). Your data set is so large that even modest filtering should make a substantial difference.
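Distance-based vertex decimation of this sort is simple to sketch. The following is a hypothetical stand-in for what a "Filter 3D polys"-style command might do, not NNA's actual implementation: walk each contour polyline and keep a vertex only when it is at least a tolerance away (in plan) from the last kept vertex.

```python
import math

def filter_polyline(points, tol):
    """Keep a vertex only if it is at least `tol` away (in plan/XY)
    from the previously kept vertex.  Endpoints are always kept so
    the contour's extent is preserved."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    for p in points[1:-1]:
        last = kept[-1]
        if math.hypot(p[0] - last[0], p[1] - last[1]) >= tol:
            kept.append(p)
    kept.append(points[-1])
    return kept

# A contour digitized every 1 unit along x, filtered at 2.5 units:
dense = [(float(i), 0.0, 10.0) for i in range(21)]
sparse = filter_polyline(dense, 2.5)
print(len(dense), "->", len(sparse))  # 21 -> 8
```

Applied across 8,867 polygons, even this crude pass would cut the vertex count severalfold without visibly changing the contours.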

So, what's the moral here? The idea is to understand both what your end goal is and what you're starting with. The real purpose of the DTM is to take relatively sparse data, interpolate it, and make topo and model data from it. You don't gain anything by making a DTM model from data that has already been interpolated. But when that's the only data you have, you shouldn't feel bad about throwing some of it away. (We are, after all, talking about dirt here.)

Petri, I will admit up front that our DTM has not yet reached the levels of usability that will allow us to call it "DTM for Dummies". You do have to think to use it -- you can't just throw anything at it. On the other hand, we have continually worked on improving it, as Peter Cipes has already attested. And we continue to do so.

Continued improvement is our goal and we will continue to strive to meet that goal. And we will continually strive to make our DTM both more reliable and more usable.


No, it is not a DTM but either a digitized paper map or one generated from stereo aerial photos, probably the latter. Topographic maps are not created via DTM, so your condescending, canned reply is fundamentally flawed and you are seriously misinformed. There IS no actual data set, so this is the real world.

And even if there were, in the real world there is no hope for anyone to gain access to original data held by Government cartographic agencies - if you think otherwise, you live in a dream.

As for filtering, as I already explained, I have tried that, getting down to 2,700 points in one section, but I still can't get the DTM generated. I would not call 2,700 points a large data set.


Robert,

I tend to agree with Petri, and I accept that Vectorworks was never intended to be a dedicated digital terrain modelling package, although the name "Landmark" gives a certain market impression which to some prospective buyers may infer that level of functionality. I don't believe that NNA intended to deliver that message, but in hindsight "Vectorworks Landscaper" or similar would probably have been a better choice of name. Despite this, I personally like Vectorworks' simplicity and don't particularly want all the functionality present in market-specific DTM packages. These programs usually have very clunky, complicated user interfaces, and do a vast array of clever things that I and many other users will probably never use, including modules for collecting and translating survey instrument data, interfaces to hydrological models, and specialized software tools for the automated production of roads and civil works documentation.

I do have the strong conviction that even if VW Landmark has only the most basic DTM contour modelling functionality, it should still produce correct information, i.e. information consistent with natural landforms, and not distort those landforms in any way; but as you know, I have given NNA many fine examples of distorted Vectorworks DTM output. Many inexperienced users unfortunately use this package and probably never question the veracity of its output. This is evident on NNA's bulletin board. A typical query goes something like this: "I finaly was able to get the 2d polys turned into 3d contours, and use those to modify the surface. The problem is that when I go to AEC\Site Model to update, it works but the surface and contours it generates aren't close to what I want. What I am doing wrong or leaving out."

NNA often answers but sometimes misinforms their users, thereby perpetuating the myth that it is indeed a user-related problem or misunderstanding. What I am suggesting is that maybe NNA has a legitimate basis on which to jump ship to another, more robust DTM kernel.

NNA must in the interim give their users viable workaround strategies. The ability to simply insert a 3D TIN produced by a surveyor into VW 11 and then generate an accurate 2D contour from it would be a good first step; from my understanding this functionality does not exist. The importance of this feature cannot be overstated. It must be understood by all that the survey TIN produced by a surveyor is the only true basis on which to form a 2D contour. It is they who know the true nature of the landform, since they have surveyed it. Therefore it is only they who know where to put all the relevant break lines and the like. In addition, most of the time they are using more sophisticated DTM packages to produce the final TIN. Robert, if I am incorrect in my assertion, please spend some time and educate me and Petri.


Mark, all you need to do to insert a 3D TIN is import the survey points (as 3D loci); there are several import options for this. If the surveyor has included break lines in addition to the TIN points, those break lines can be imported as 3D polys on the source data layer. Then you can build your DTM using that source data. A TIN is a TIN, pretty much. Most DTM software uses the same algorithm (2D Delaunay triangulation) to build the TIN. You therefore don't need the TIN polys, just the points.
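The 2D Delaunay step described here can be reproduced with off-the-shelf tools. As a hedged illustration (this assumes SciPy is available; NNA's actual engine is of course something else), triangulating the plan positions of a few made-up 3D loci:

```python
import numpy as np
from scipy.spatial import Delaunay

# Made-up survey loci: (x, y, z).  Only x and y drive the
# triangulation; z simply rides along on each vertex.
loci = np.array([
    [0.0,  0.0, 10.0],
    [10.0, 0.0, 12.0],
    [10.0, 10.0, 11.0],
    [0.0, 10.0,  9.0],
    [5.0,  5.0, 14.0],
])

tin = Delaunay(loci[:, :2])   # plain 2D Delaunay in plan view
print(len(tin.simplices))     # 4 triangles: the centre point fans to the corners
```

Each row of `tin.simplices` indexes three of the input points, so the z values can be looked up per triangle to get the 3D TIN faces.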


It is indeed. It is called 'real world data.' VW 9 cannot even create a DTM of fewer than 3,000 points (after segmenting and filtering the data). However, as usual, results are very inconsistent, as some segments are OK.

A couple of years ago I had to resort to asking an AutoCAD user to help with a DTM. I can't remember the complexity, but it was too much for VW. With a $200 DTM add-on my friend generated the TIN in a matter of seconds, including slope analysis (which VW is supposed to do, but does not.)

Conclusion: VW's DTM is for small and simple projects and makes unrealistic assumptions about the source data (ref. the inconsistency). This would be fine if it were made clear to prospective buyers. The unreliability (reported by several users), however, is so severe that the whole product is hardly of merchantable quality.


Petri, while this may not help you in the slightest, I have to tell you that IMO the DTM in version 11 is much more functional and reliable than in previous versions. In fact, I never used it in versions 9 or 10 because it was so frustrating. That said, my projects are indeed on the small side, usually no more than an acre or two of land (and sometimes much smaller), for residential development/construction. I have had the most success by tracing contours (provided by a surveyor) and then converting them to 3D polys, from which a terrain model is created. Once this is done (which is pretty fast) it is actually possible, and in fact fun, to work with site modifiers and massing models. Sorry you're having such a time!


Robert;

The only problem is that there are many varieties of Delaunay triangulation algorithm for two and three dimensions. The particular algorithm used depends largely on the situation, for example: is the terrain convex or concave? What is the shape of the enclosing polygon? Thus some DTM code is "smarter" than other code, but no code will get it right 100% of the time. The various algorithms, for example:

- Octree = randomized insertion algorithm using an octree for locating the enclosing tetrahedron
- DeWall = Delaunay wall algorithm without a searching structure
- Matrix = with sparse matrix
- Gitter = with regular grid
- Insert = randomized insertion algorithm

give very different results depending on the situation.

What I am talking about is that the surveyor is the one who truly knows how to form the triangles of an existing terrain, Petri's "real world data". It is he or she who knows which triangles need to be flipped and which don't, and which triangles are too thin, by reference to their field notes. Triangle flipping is a basic feature in almost all DTM packages, yet the VW 11 DTM doesn't have this flexibility.
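For the curious, the flip decision is conventionally driven by the in-circle predicate: an edge should flip when the fourth point falls inside the circumcircle of the triangle on the other side. A minimal plain-Python sketch (illustrative only, not how any shipping DTM engine is written, and ignoring the robustness issues real implementations must handle):

```python
def in_circle(a, b, c, d):
    """Delaunay in-circle predicate: for triangle (a, b, c) given in
    counter-clockwise order, return True if point d lies strictly
    inside its circumcircle, i.e. the edge opposite d should be
    flipped to restore the Delaunay property."""
    adx, ady = a[0] - d[0], a[1] - d[1]
    bdx, bdy = b[0] - d[0], b[1] - d[1]
    cdx, cdy = c[0] - d[0], c[1] - d[1]
    ad2 = adx * adx + ady * ady
    bd2 = bdx * bdx + bdy * bdy
    cd2 = cdx * cdx + cdy * cdy
    # 3x3 determinant of the lifted points, expanded by the first row.
    det = (adx * (bdy * cd2 - bd2 * cdy)
           - ady * (bdx * cd2 - bd2 * cdx)
           + ad2 * (bdx * cdy - bdy * cdx))
    return det > 0

# d near the triangle -> inside the circumcircle -> flip;
# d far below -> outside -> no flip.
print(in_circle((0, 0), (4, 0), (2, 3), (2, -1)))   # True
print(in_circle((0, 0), (4, 0), (2, 3), (2, -5)))   # False
```

A surveyor's manual flip overrides exactly this test, which is why field knowledge of the landform cannot be recovered from the points alone.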


Well, a Delaunay triangulation is mathematically defined (and should be unique for a given set of 2D points), although there may be different coding implementations, which is what I believe you are describing. You'll notice I said "2D Delaunay triangulation" above. I'm not sure a "3D Delaunay triangulation" (as you suggest) is unique.

And if we were talking about survey data (sparse by nature), our DTM would not be having troubles! Petri's so-called "real world data" is not in fact survey data; it is high-vertex-count topo polygons derived from (probably) stereo photos. My objection is that the extra vertices add no useful data.

By the way, if a surveyor has a "break line" he wants to include in the source data, he can include it as a 3D poly. He can mix points and edges in the source data. In this way you can specify edges, breaks, and ridges. I agree that triangle-flipping would be nice, but you can use this in the interim.


Robert,

OK, I've got to jump in here.

First of all, I appreciate the effort you and your staff expend in continuing to improve the product. For example, the improvements made to the Landmark user interface in Vectorworks 11 are certainly a step forward. I have found the interface easier to understand, and therefore quicker for performing the iterative process required to get a usable DTM.

Yet I continued to have some difficulties, so I gave tech support a call. The response I received gave me some insight into how to get the DTM tool to provide more accurate results. I was told that one will experience better results by converting the 2D contours to 3D polygons instead of using 3D loci. This matched what I had been experiencing, but I thought maybe I was doing something wrong.

What I would suggest, while users wait for an improved DTM-generating algorithm, is for Nemetschek to be clearer in the supporting documentation about which input methodologies work best with the current algorithm. Specifically in this case, tech support admitted that one gets better results by converting 2D contours to 3D polygons instead of using 3D loci.

For what it's worth, I've never been able to get 3D loci to generate anything but a DTM with all sorts of bizarre contours. In addition, based on my conversation with tech support, I've confirmed that a more accurate and robust model can be generated by manually interpolating additional 2D contours as source data for the DTM algorithm. By robust I mean a model that more readily accepts site modifiers without generating bizarre results. Based on some of the other posts, I don't think my experience is unique, and I'm sure others would appreciate some additional clarity from Nemetschek in this regard.

Finally I appreciate your efforts to provide perspective on the situation by reminding us that "it's only dirt". Of course what we are really considering here is the accurate placement of dirt, not the dirt itself.

Overall, I'm very pleased with VW 11, I think it provides excellent "bang for the buck" and I look forward to experiencing the fruits of your labor - continued improvements to the product.

John Brubacher

Mac OS 10.3.4


Robert;

A belated reply, since I have been away. Although I don't claim to be an expert in these matters, my experience of using different digital terrain modelling programs tells me that something is askew with the current Vectorworks DTM code kernel. As has been said by me and others, the DTM module often gives unreliable, and to the uninformed dare I say misleading, information, be it in respect of its calculated 2D or 3D data representation. This cannot be mass delusion among the user base; and it is not true to say that, given a certain data set, the triangulation (TIN) and hence the contour representation will be unique. Sparse data by definition must be interpolated, and herein lies the problem with the algorithms currently in use: the current implementation just does a bad job of interpolating and weighting the 3D point data. For example, look at the differences one can achieve using different algorithms.

[figure10b.jpg]

- Contours based on linear triangle-based interpolation
- Contours based on Clough-Tocher interpolation of the data points
- Contours based on natural neighbour interpolation and extrapolation of the data points

This tells the story, and I suggest your programmers read this article at this web address.
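The first two of those methods are available off the shelf, so the difference is easy to see for oneself. As an illustration (assuming SciPy, with made-up spot heights; SciPy does not ship a natural-neighbour interpolator), linear and Clough-Tocher interpolators built on the same triangulation can return different values at the same query point:

```python
import numpy as np
from scipy.interpolate import (CloughTocher2DInterpolator,
                               LinearNDInterpolator)

# Hypothetical scattered spot heights sampled from a smooth surface.
rng = np.random.default_rng(42)
xy = rng.uniform(0.0, 10.0, size=(40, 2))
z = np.sin(xy[:, 0] / 2.0) + np.cos(xy[:, 1] / 2.0)

linear = LinearNDInterpolator(xy, z)        # piecewise-linear on the TIN
smooth = CloughTocher2DInterpolator(xy, z)  # C1 cubic on the same TIN

q = np.array([[5.0, 5.0]])                  # a query point inside the hull
print(float(linear(q)[0]), float(smooth(q)[0]))
```

Contours traced through these two fields will disagree most where the data are sparsest, which is exactly the situation the figures above illustrate.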


I use Manifold GIS for my DTM preparation; it is cheap and powerful and exports data in a format readable by Vectorworks. I have found, however, that Vectorworks 11 is easily overwhelmed by too many points, even quantities that GIS can process in a microsecond. Particularly deadly is pressing the undo button on a rendered DTM: one then has to sit and twiddle while the destroy list does its job.

For the programmers' information, I see Vectorworks as a means of doing whole-of-subdivision land analysis. Absolute scales for this work vary from 1:100 to at most 1:10,000. At smaller scales, GIS is the way to go. The area to be modelled can be up to 5 km square (e.g. a resort complex with a couple of golf courses). A surveyor's detail survey normally results in a 15 m grid being placed over the terrain, with break lines at the tops of banks and in creeks. This is the spatial resolution of the data.
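A back-of-the-envelope count shows why that resolution strains VW: a 15 m grid over a 5 km square site is already over a hundred thousand points before any break lines are added.

```python
side_m = 5000        # 5 km square site
grid_m = 15          # surveyor's detail-survey grid spacing
per_side = side_m // grid_m + 1   # grid nodes along one side
points = per_side ** 2
print(per_side, "x", per_side, "=", points)  # 334 x 334 = 111556
```

That is roughly two thirds of the point count in Petri's topo data set, from a perfectly ordinary survey workflow.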

GIS generally is not very good at working with models of structures such as buildings. Therefore I would prefer to use CAD for showing buildings placed in landscapes.

Can Vectorworks drape a photo over a DTM, either as a texture bed or just as a draped photo?

