
DTM - the dark side


Recommended Posts

Have a file. Have a layer with a whopping 57 (fifty-seven) 3D polygons. I say 'Site Model,' get the dialog, answer the questions. Get a message: "This model contains no DTM, so the command was not executed."

Hello? I don't have a DTM, so I can't get one?

OK, so I copy and paste this MONSTER data set to a new file. I say 'Site Model' and everything looks cool. And still looks cool after 20 minutes, 30 minutes, 40 minutes. The beach ball is rotating, so obviously something is happening. After 50 minutes, I force quit. Experiment repeated, but this time I only have time to wait half an hour before hitting the keyboard again.

Have another file. Said 'Site Model', got a DTM. (Glory be!) Made some modifications. Tried to regenerate model. What VW now says is that 'NNA_Hidden_record is missing' and nothing happens. Copy and paste to a new file, and everything works again.

Hello? How did this cursed hidden record go AWOL?

This is ridiculous!

Mind you, with AutoCAD and a very low-cost DTM generator, I have quickly and without sweat generated DTMs with thousands, even tens of thousands, of 3D points.

VW 9.5 MacOS 9.2.2 G4 800MHz/768MB, 400,000 K allocated for VW

Link to comment

Well, this gets interesting. Since our last exciting episode, I have found out that without a boundary, the 57-polygon DTM could indeed be generated.

However, more often than not, I would need to generate a DTM from a subset of the 3D data (as some of my data sets cover several square kilometres and I only want a few hectares).

So, where are we in this respect?

Link to comment
  • Vectorworks, Inc Employee

A couple of observations:

1. A "3D poly" can contain as few as 3 or as many as thousands of data points, so they're not good indicators of the size of a data set.

2. As far as abstracting out a smaller data set, it's much easier to do when the data set is 3D loci rather than 3D polys. Then you can simply delete data irrelevant to your current project. Since I know you like scripts, Petri, here's a simple script to turn 3D polys into 3D loci (there's also a command for this, but it has a bug until the next release):

{-----BEGIN VECTORSCRIPT-----}
{This script converts 3D polys on the active layer to 3D loci}

PROCEDURE P3D2Loci;
VAR
	Wreckem : BOOLEAN;
	LName : STRING;

	PROCEDURE Convert(h : HANDLE);
	VAR
		total, k : INTEGER;
		x1, y1, z1 : REAL;
	BEGIN
		total := GetVertNum(h) - 1;
		FOR k := 0 TO total DO BEGIN
			GetPolyPt3D(h, k, x1, y1, z1);
			Locus3D(x1, y1, z1);
			SetDSelect(LNewObj);
		END;
		IF Wreckem THEN SetSelect(h);
	END;

BEGIN
	DSelectAll;
	Wreckem := YNDialog('Do you wish to delete the 3D polys?');
	LName := GetLName(ActLayer);
	ForEachObject(Convert, ((L = LName) & (T = Poly3D)));
	IF Wreckem THEN DoMenuTextByName('Clear', 0);
END;
RUN(P3D2Loci);
{------END VECTORSCRIPT------}

Link to comment

You don't address too many issues in your reply, Robert.

I agree that the number of polygons is indeed not a good indication, but in this case, I had 595 vertices.

The problem with 3D loci is that sometimes (eg in this case), there would be duplicate loci where two 3D polygon edges coincide. Yes, I have a script to filter out duplicates, but (i) not everyone has it and (ii) it is another step (unless incorporated in the conversion script - not a huge task, in fact.) I've done this conversion many times over the years, so I know what I'm talking about!
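Something along these lines would do it - a quick, completely untested sketch that folds the duplicate check straight into Robert's conversion script above. The procedure name and the 10,000-point limit are arbitrary assumptions, and only exactly coincident coordinates are treated as duplicates:

PROCEDURE P3D2LociNoDups;
{Untested sketch: Robert's poly-to-locus conversion with a duplicate check folded in.}
VAR
	Wreckem : BOOLEAN;
	LName : STRING;
	px, py, pz : ARRAY[1..10000] OF REAL;
	nPts : INTEGER;

	FUNCTION IsNewPoint(x, y, z : REAL) : BOOLEAN;
	VAR
		k : INTEGER;
		seen : BOOLEAN;
	BEGIN
		seen := FALSE;
		FOR k := 1 TO nPts DO
			IF (px[k] = x) & (py[k] = y) & (pz[k] = z) THEN seen := TRUE;
		IsNewPoint := NOT seen;
	END;

	PROCEDURE Convert(h : HANDLE);
	VAR
		k : INTEGER;
		x1, y1, z1 : REAL;
	BEGIN
		FOR k := 0 TO GetVertNum(h) - 1 DO BEGIN
			GetPolyPt3D(h, k, x1, y1, z1);
			IF IsNewPoint(x1, y1, z1) THEN BEGIN
				{remember the point and place a locus only if it has not been seen yet}
				nPts := nPts + 1;
				px[nPts] := x1;
				py[nPts] := y1;
				pz[nPts] := z1;
				Locus3D(x1, y1, z1);
				SetDSelect(LNewObj);
			END;
		END;
		IF Wreckem THEN SetSelect(h);
	END;

BEGIN
	DSelectAll;
	nPts := 0;
	Wreckem := YNDialog('Do you wish to delete the 3D polys?');
	LName := GetLName(ActLayer);
	ForEachObject(Convert, ((L = LName) & (T = Poly3D)));
	IF Wreckem THEN DoMenuTextByName('Clear', 0);
END;
RUN(P3D2LociNoDups);

The brute-force comparison gets slow on big data sets, of course, but for a typical polygon file it should be tolerable.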

Creating a subset is often counter-productive: I may need several DTMs in a single project from one large data set (somewhat difficult, especially with the current symbol-based approach), or I may need to redefine the area of interest. The boundary is an excellent mechanism for these, if only it worked.

Speaking of scripts, I wrote my own crude and awfully labour-intensive 'DTM' system in 1994-95. One tiny component is this:

PROCEDURE CountVertices; { © Petri Sakkinen 1995 }
VAR
	t : LONGINT;
	h : HANDLE;
BEGIN
	h := FSActLayer;
	t := 0;
	WHILE h <> NIL DO BEGIN
		t := t + GetVertNum(h);
		h := NextSObj(h);
	END;
	Message(t);
END;
RUN(CountVertices);

May be helpful for others struggling with DTMs.

Finally, the new DTM looks quite ambitious and useful and I hope it is improved soon to fulfill the promises! To me, these are early days with the new version (upgraded to VW 9 only two weeks ago.)

Link to comment

This bulletin board format is ridiculous! I composed a long reply in the postage stamp sized window, then tried to post it, got some message about 'flood control' and a 'waiting period between posts' - and lost all I had done.

After an hour, I still get the same message. Now, it is ten hours since the last attempt. Let's see what happens. (Or maybe I have been moderated?)

Anyway, I'll try again, now conveniently in BBEdit...

Right, where was I?

What you say about 'unusual' may indeed be true when you have someone to do a survey for a particular site, but we mainly work with data from other types of sources.

In this case, we have a very large data set, the entire Melbourne CBD, as 3D polygons. The terrain (lots, streets) is triangulated. We are adding a few sites and streets (an extension of the CBD) and have no data to speak of. Using some assumed datum points plus the known level of the river, we continued with the same approach. However, we can't 'float' anything on this kind of model, so we need the DTM.

Anyway, we get a lot of data in this 'TIN' format - just last week, a fairly large area that was modelled in Studio Max. Some surveyors we have been dealing with also don't release 'proper' 3D data (wanting to do the DTM themselves).

So, to us, the unusual is usual and vice versa. Needless to say, AutoCAD has handled all kinds of data sets with grace, including attributed 2D contours (a typical GIS data type). I can't remember what the add-on is called, as we don't have it ourselves but have a friend do the DTMs for us, but I believe it costs something like USD 200.

The resulting TINs are of course of little use - no floating, no proposed terrain etc. - but they are better than no DTM at all.

Link to comment
  • Vectorworks, Inc Employee

This is unusual data for site models, if there are large (or even significant) numbers of coincident edges. My experience is that 3D data is usually defined as a "point cloud" of 3D points (such as the data created by an electronic transit) or as "break lines" which could resemble 3D topo lines. Where is your data coming from?

Link to comment
  • Vectorworks, Inc Employee

Your explanation makes sense. If you are already getting information in a TIN format (which is of course the same TIN - triangulated irregular network - that the VW DTM produces), you'll need to write a script to convert the TIN to its constituent points. The script above will work, but you'll have to add a sorting algorithm to find and eliminate duplicate points. (Probably the best method is to put the points in an array and sort by X, then test Y and Z to see if they coincide.) If you do this, you'll get clean input data for the DTM.
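To make that concrete, here is a rough, untested sketch of the sort-and-compare step. The procedure name and the 20,000-locus array bound are arbitrary; it assumes the 3D loci are selected on the active layer, and it only selects the duplicates rather than deleting them, so you can inspect them first:

{-----BEGIN VECTORSCRIPT-----}
{Rough sketch: flag duplicate 3D loci by sorting on X, with Y and Z as tie-breakers}
PROCEDURE WeedDuplicateLoci;
VAR
	hArr : ARRAY[1..20000] OF HANDLE;
	xArr, yArr, zArr : ARRAY[1..20000] OF REAL;
	obHd, tmpH : HANDLE;
	tmpR : REAL;
	i, j, n : INTEGER;
	swapIt : BOOLEAN;
BEGIN
	{gather the selected loci and their coordinates}
	n := 0;
	obHd := FSActLayer;
	WHILE (obHd <> NIL) & (n < 20000) DO BEGIN
		n := n + 1;
		hArr[n] := obHd;
		GetLocus3D(obHd, xArr[n], yArr[n], zArr[n]);
		obHd := NextSObj(obHd);
	END;

	{simple exchange sort: X first, then Y, then Z}
	FOR i := 1 TO n - 1 DO
		FOR j := i + 1 TO n DO BEGIN
			swapIt := FALSE;
			IF xArr[i] > xArr[j] THEN swapIt := TRUE
			ELSE IF (xArr[i] = xArr[j]) & (yArr[i] > yArr[j]) THEN swapIt := TRUE
			ELSE IF (xArr[i] = xArr[j]) & (yArr[i] = yArr[j]) & (zArr[i] > zArr[j]) THEN swapIt := TRUE;
			IF swapIt THEN BEGIN
				tmpR := xArr[i]; xArr[i] := xArr[j]; xArr[j] := tmpR;
				tmpR := yArr[i]; yArr[i] := yArr[j]; yArr[j] := tmpR;
				tmpR := zArr[i]; zArr[i] := zArr[j]; zArr[j] := tmpR;
				tmpH := hArr[i]; hArr[i] := hArr[j]; hArr[j] := tmpH;
			END;
		END;

	{coincident points are now adjacent; select the extras for review or deletion}
	DSelectAll;
	FOR i := 1 TO n - 1 DO
		IF (xArr[i+1] = xArr[i]) & (yArr[i+1] = yArr[i]) & (zArr[i+1] = zArr[i]) THEN
			SetSelect(hArr[i+1]);
END;
RUN(WeedDuplicateLoci);
{------END VECTORSCRIPT------}

Sorting on X with Y and Z as tie-breakers puts coincident points next to each other, so a single pass over the sorted list is enough to pick out the extras; delete the resulting selection and you have clean input data.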

As I understand it, the DTM algorithms we use treat a polygon edge differently from the two 3D vertices that define it: the edge is treated as a so-called "break line" and additional data extrapolation is attempted along it. If you are inputting a TIN which is already optimized, there is no useful additional information to be gained from the edges, so they just create overhead for the DTM. Better to eliminate them.

I hope this explanation is helpful to you.

Link to comment

The 'different algorithm' makes sense and reminded me of the fact that when I was attempting to write my own DTM (before there was one), I indeed found two different algorithms on (as I seem to recall) the web site of the University of Christchurch, New Zealand. The academic paper was a bit too much for me, I must say...

Anyway, for years I have needed to filter out duplicate 3D loci, using the script below. No sorting, as I don't think it improves the overall speed that much. The logic is that one can actually see where the duplicates are (unless there are many!), and that may help in improving the source data.

Maybe this helps someone else.

PROCEDURE Duplicates; { © Petri Sakkinen 1997 - 2002 }
VAR
	h : ARRAY[1..10000] OF HANDLE;
	obHd : HANDLE;
	xh1, yh1, xo1, yo1, z : REAL;
	i, j, n : INTEGER;

	PROCEDURE MakeArray;
	BEGIN
		obHd := FSActLayer;
		WHILE obHd <> NIL DO BEGIN
			i := i + 1;
			h[i] := obHd;
			obHd := NextSObj(obHd);
		END;
	END;

	PROCEDURE CheckDups;
	BEGIN
		FOR n := 1 TO i - 1 DO BEGIN
			Message(n);
			GetLocus3D(h[n], xh1, yh1, z);
			FOR j := n + 1 TO i DO BEGIN
				GetLocus3D(h[j], xo1, yo1, z);
				IF (xh1 = xo1) & (yh1 = yo1) THEN SetSelect(h[j]);
			END;
		END;
	END;

BEGIN
	i := 0;
	MakeArray;
	DSelectAll;
	CheckDups;
END;
RUN(Duplicates);

Link to comment

Well, it turns out that it makes quite a difference to use 3D loci, instead of 3D polygon contours.

I was able to generate a DTM from 34,000 points, roughly 2 square kilometres of terrain. Unfortunately, in my strange world, this is not yet much. In a couple of weeks' time I need to model significantly larger areas. It'll be interesting to see if the new symbol approach helps - theoretically it should make it possible to do the DTM in sections (each with its own symbol name, perhaps even each in a separate file), but the question is how the resulting jigsaw pieces join together.

Anyway, the above is a lot more than I was ever able to do in VW 8, so I guess the money was well spent. (Both for VW 9 and a faster machine with more RAM. Much more RAM.)

Link to comment

My woes with the DTM continue - and are getting ever worse. Keeping a running log here, just in case someone else has either similar or totally opposite experiences.

1. The large DTM I managed to generate cannot be regenerated (I wanted to reduce the area). 'No DTM' and 'Error (9, 67)' alternate.

2. I put a lot of time and effort into making a simpler DTM for real-time walk-throughs: filtered the 3D polygons, edited all intersecting 3D polys, and created a locus-based data set of 6,500 points. From this data set, I can't generate a DTM at all.

I have just, however, received a DTM of the latter data set, kindly generated by Robert Anderson, who says he had no problems. VERY strange! I can't expect Robert to do all my DTMs. (Although I guess it is an idea... we are short of staff & computers... maybe he could also lend me a hand with a small library design project... Hmmmm...)

3. Tried to use the original DTM in a compound model via WGR & layer linking, but the terrain came through at half scale. Converted the DTM symbol to a group and got it to work - but clearly, this is unsatisfactory as a long-term solution.

Link to comment
