Petri Sakkinen

Posts posted by Petri Sakkinen

  1. Otherwise, this would be easily solved with a script, but determining which point of the polygons should be kept in place is rather difficult. If the centrepoint (of the bounding box, ie. not the real centroid) is OK, here's the script:

    PROCEDURE ScaleMultiple; { © Petri Sakkinen 2003 }
    VAR
        active : STRING;
        scalingFactor : REAL;
        i, n : INTEGER;
        selectedObj : ARRAY[1..1000] OF HANDLE;

        { collect handles of the selected objects on the active layer }
        PROCEDURE MakeArray (h : HANDLE);
        BEGIN
            i := i + 1;
            selectedObj[i] := h;
        END;

        { scale the objects one at a time, so each keeps its own centrepoint }
        PROCEDURE ScaleEach;
        BEGIN
            FOR n := 1 TO i DO BEGIN
                SETSELECT(selectedObj[n]);
                SCALE(scalingFactor, scalingFactor);
                SETDSELECT(selectedObj[n]);
            END;
        END;

        PROCEDURE RestoreSelection;
        BEGIN
            FOR n := 1 TO i DO
                SETSELECT(selectedObj[n]);
        END;

    BEGIN
        scalingFactor := REALDIALOG('Scaling factor', '1');
        IF NOT DIDCANCEL THEN BEGIN
            i := 0;
            active := GETLNAME(ACTLAYER);
            FOREACHOBJECT(MakeArray, ((SEL = TRUE) & (L = active)));
            DSELECTALL;
            ScaleEach;
            RestoreSelection;
        END;
    END;
    RUN(ScaleMultiple);

    [ 09-04-2003, 10:13 PM: Message edited by: Petri Sakkinen ]

  2. quote:

    Originally posted by jodawi:

    Often the objects will all come in as "by layer" which can map to "by class", so you could possibly just modify class attributes with a script.

    And perhaps the best way to do that is my free 'Cloning around' collection on VectorDepot.

    Once you have configured your class attributes (well, twice - for import & for export) you can save the settings to be used in new/other documents.

  3. quote:

    Originally posted by Yovav:

    In the past (8.x days) there was such a script and it worked pretty fine. During the years, and the upgrades, the script stopped working, and the guy who developed it doesn't deal with VW any more.

    Yovav,

    Email the script to me - I can probably fix the obsolete calls etc. in a matter of minutes. Well, I hope so...

    The address is

    petri@4dform.com.au

  4. quote:

    Originally posted by Yovav:

    the file is opening without the "map fonts" option in the translation process, and the Gibberish is popping into the screen. It seems like the character map that Autocad uses in Hebrew is totally different from the one that VectorWorks, and other windows software, are using.


    Does the gibberish indicate that the Hebrew font in question is actually used? Do you have access to another Hebrew font? Does changing the font make any difference? Is it total gibberish or is there a pattern (well, there should be...)

    What I'm getting at is that it would be dead easy to write a script that 'remaps' ASCII codes in all text strings, ie. converts from one coding to another. A job for the local distributor of VW, perhaps?
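
    Just to illustrate the idea - a rough sketch only: the offset below is made up, the real work is building the correct mapping between the two code pages, and the Pascal-style Ord/Chr and string indexing are assumed to behave the usual way:

    PROCEDURE RemapHebrew; { sketch only - not a working converter }
    CONST
        offset = 16; { made-up shift between the two code pages }

        PROCEDURE RemapOne (h : HANDLE);
        VAR
            oldText, newText : STRING;
            j : INTEGER;
            code : LONGINT;
        BEGIN
            oldText := GETTEXT(h);
            newText := '';
            FOR j := 1 TO LEN(oldText) DO BEGIN
                code := ORD(oldText[j]);
                IF code > 127 THEN { only touch the 'upper' (Hebrew) characters }
                    newText := CONCAT(newText, CHR(code + offset))
                ELSE
                    newText := CONCAT(newText, COPY(oldText, j, 1));
            END;
            SETTEXT(h, newText);
        END;

    BEGIN
        FOREACHOBJECT(RemapOne, (T = TEXT));
    END;
    RUN(RemapHebrew);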

  5. Well, then you can create two text objects, name them 'datestamp' and 'filename', and use this script:

    PROCEDURE DateStamp;
    VAR
        obHd : HANDLE;
        dateNow, fName : STRING;
    BEGIN
        dateNow := DATE(2,1);               { current date as a string }
        fName := GETFNAME;                  { name of the current file }
        obHd := GETOBJECT('datestamp');     { the text object named 'datestamp' }
        SETTEXT(obHd, dateNow);
        obHd := GETOBJECT('filename');
        SETTEXT(obHd, fName);
    END;
    RUN(DateStamp);

  6. DB,

    In my system, there is no font 6.

    You can either find out the font ID with this script:

    MESSAGE(GETFONTID('Chicago'));

    or, preferably, use

    fID := GETFONTID('Chicago');

    and then (after creating the text, ie. instead of SetFont at the beginning) apply the font to the newly created object:

    SETTEXTFONT(LNEWOBJ, 0, GETTEXTLENGTH(LNEWOBJ), fID);

    SetFont sets the default font, which is perhaps not what you want. The same approach applies to size.

    However, there is a better approach altogether: a 'named object' ('Date Stamp') which is updated with a command.

    My free 'Print and Stamp' menu command at VectorDepot does exactly this when you use it to print. You can define the font, size, location, orientation, colour etc. of the date stamp any which way and it will be updated. In addition to the date, you get the filename.

    Not that I would use it myself: I have these as parts of my titleblock and they are updated via a menu command. This is even better...

    [ 08-22-2003, 01:59 AM: Message edited by: Petri Sakkinen ]

  7. Katie,

    I don't know how you test the program - I only use perceptive testing and according to that, many functions are much slower in VW 10. However, I sent a sample file about a certain (raster image-related) issue to Paul Pharr some weeks ago - in that case, it is not about perception but something you can see (flickering while-U-wait, wait & wait). Now, if you add the speed loss of VW 10 to the speed loss of OS X, sluggish is an understatement.

    I don't use OS X but have still decided not to upgrade to VW 10 because of the loss of speed and also because of some bad user interface decisions. While there are some attractive new features, for everyday work I would need to stick to VW 9 anyway, so spending big for a couple of fancy features and endless hassles does not appeal at all.

  8. Like most DTMs, that of VW works out contours from the TIN model (triangulated irregular network), which 'simply' (the algorithms I have seen are anything but simple) connects the dots (3D loci or 3D poly vertices).

    What happens between points is not known, and linear interpolation is just as valid as any other. For the human brain (at least that of an experienced landscape analyst), it is possible to 'see' (= guess) the 'trends' of landforms, based on what is 'above' and 'below' any contour, but I'm not sure I would like to have a stupid computer making these intuitive guesses - after all, the computer has not made any geomorphological field trips, done orienteering or tried to find wild mushrooms with a topographic map.
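
    For what it's worth, the linear part itself is trivial: along a triangle edge with end elevations z1 and z2, a contour at level c crosses at the fraction (c - z1) / (z2 - z1) of the edge, and the contour line is threaded through those crossing points. As a one-liner (an illustration only, not what the DTM engine actually does internally):

    FUNCTION ContourCrossing(z1, z2, c : REAL) : REAL;
    { fraction (0..1) along a TIN edge at which contour level c crosses; assumes z1 <> z2 }
    BEGIN
        ContourCrossing := (c - z1) / (z2 - z1);
    END;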

    So, to make the point: any other interpolation would try to 'add' something to the data. A good surveyor takes his/her spot levels in an intelligent manner and that certainly helps. Terrain modeling is not an exact science in any case.

    And now for something completely different: in addition to TIN models, there are also DTM engines based on the raster concept. With a raster (grid), it is indeed easy to traverse the data set along grid lines / raster cells and do 'trend' type calculations, more complex interpolations / extrapolations and even 'typology' analyses. This, however, requires huge data sets compared with the TIN approach.

    Whoa - I'm trying to sound like an expert! That I am not. Definitely not.

  9. quote:

    Originally posted by bclydeb:

    Petri and Richard

    May I politely take you to task for hooting at those whose perceptions about what can be "afforded" are different than your own.


    With all due respect, I have no idea what you are talking about. Vitanaut made the point about 'leaders' but does not show any signs of leading in any other compartment than whinging. You, on the other hand, made some very sober comments about appropriate software.

    [ 08-07-2003, 11:12 AM: Message edited by: Petri Sakkinen ]

  10. Well, volume is easy. At its simplest, you only need a one-line script:

    MESSAGE(CALCVOLUME(FSACTLAYER));

    Mass - that gets more difficult. But if materials are assigned via classes (such as MAT-STEEL, MAT-CONCRETE etc., which I use for graphic attributes), a script can eg. read a data file of materials and find the specific weight of the material. For a limited palette of materials, you may even hard-code the weights in the script.
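
    Something like this, for instance (a sketch only: the densities are the usual textbook figures in kg/m3, and I'm assuming CalcVolume returns cubic metres - check the units in your own setup):

    PROCEDURE MassFromClasses; { sketch - hard-coded densities }
    VAR
        totalMass : REAL;

        PROCEDURE AddSteel (h : HANDLE);
        BEGIN
            totalMass := totalMass + CALCVOLUME(h) * 7850;
        END;

        PROCEDURE AddConcrete (h : HANDLE);
        BEGIN
            totalMass := totalMass + CALCVOLUME(h) * 2400;
        END;

    BEGIN
        totalMass := 0;
        FOREACHOBJECT(AddSteel, (C = 'MAT-STEEL'));
        FOREACHOBJECT(AddConcrete, (C = 'MAT-CONCRETE'));
        MESSAGE('Total mass (kg): ', totalMass);
    END;
    RUN(MassFromClasses);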

    Alternatively, a worksheet can be used in VW 10. In the first column, enter eg. =VOLUME(C='MAT-STEEL'); in the second column, enter the specific weight; and in the third, the multiplication.
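
    For example, with my own illustrative figures (densities in kg/m3, volumes assumed to come back in m3 - check the units in your own file):

    A1: =VOLUME(C='MAT-STEEL')       B1: 7850    C1: =A1*B1
    A2: =VOLUME(C='MAT-CONCRETE')    B2: 2400    C2: =A2*B2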

    Having said this: I have not met too many engineers who would work in 3D or be even remotely interested in volumes. But then again, I am an architect so the engineers I know do construction-related work. 3D (not even in thinking) does not seem to be a pre-requisite for these engineering disciplines.

  11. quote:

    Originally posted by Vitanaut:

    Bclydeb,

    Thanks for the input. I am surprised by your argument however.


    Hey, Vitanaut - you said you don't even have full Acrobat. So, since you are either poor or stupid - or most likely both - don't expect the world to compensate for your shortcomings. Get a life so you don't need to attack people who try to help you. No soup for you!

  12. quote:

    Originally posted by Runtime Error:

    Anyone know where I'd find one of the scripts to add to saved sheets?

    Here:

    PROCEDURE ScaleLineWeight; { © Petri Sakkinen 2003 }
    CONST
        factor = 0.5; { multiply every line weight by this }

        PROCEDURE ScaleIt (h : HANDLE);
        BEGIN
            SETLW(h, GETLW(h) * factor);
        END;

    BEGIN
        FOREACHOBJECT(ScaleIt, (ALL));
    END;
    RUN(ScaleLineWeight);

    [ 08-01-2003, 08:38 AM: Message edited by: Petri Sakkinen ]

  13. While we are at it...

    The database structure is somewhat unfortunate. Instead of a single Latin Name field, it should have fields

    - genus

    - species

    - subspecies / variety

    - cultivar / provenance

    and a way of quickly accessing all entries of a genus.

    At least 'optional' short names would be advantageous: I have implemented several plant database systems (relational databases) using species codes AND developed a landscape module of my own for VW, also using them to great effect, especially when it comes to plant species 'tags' in working drawings.

    Since the idea is not mine*, I can even reveal it: the syntax is GEN.SPE, optionally GEN.SPE.VAR where VAR can be the hybrid species, variety or cultivar.
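
    For example (my own made-up entries, not Melbourne's actual codes):

    ACE.PLA        Acer platanoides
    ACE.PLA.CRI    Acer platanoides 'Crimson King'
    QUE.ROB        Quercus robur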

    So far, this has never failed, even in projects with hundreds of species & varieties. Fast, efficient, legible without redundancy...

    Phew.

    Also a means of specifying different types of stock of the same species (etc) is necessary for serious work.

    *) courtesy of Melbourne (AU, not FL) City Council's 'nurserycode' system

  14. Still to read the White Paper...

    However, as much as I would like to have dynamic sections (and, and, and...), for the BIM concept this is not actually a crucial issue - maybe not an issue at all.

    From my point of view, the main issue is interoperability - and in this respect, VW fails miserably. The proprietary object technology and inability to use & modify generic formats (including statements such as 'STEP is dead', 'no-one supports Industry Foundation Classes'*, 'ODBC is not of great interest to us' etc.) are not good. MAYBE Autodesk, Bentley, Nemetschek AG (ie. the real McNemetschek) and Graphisoft are just toying with interoperability, who knows, but I have not seen any signs of even toying from NNA.

    That off my chest, I must also say that VW has heaps of BIM features already. Too bad that nobody uses them.**

    *) as in 'nobody expects the Spanish inquisition'

    **) ditto

  15. quote:

    Originally posted by jnr:

    Petri:

    (1) Do you re-import the data back to Vectorworks?

    (2) I would pay the difference to have the ODBC capability as it would save me a ton of time. People seem to spend a lot of time obsessing about the cost of software on these boards, yet it pales in comparison to the cost of labor.

    (1) Occasionally, but not often. In this respect, there is a fundamental problem with VW: entities do not have a persistent, unambiguous internal ID, so import has to rely either on named objects or geographic operands. The former are fine, but laborious to establish, the latter are unreliable and only useful in certain (simple) circumstances.

    (2) You would, would you? Well, for many years, I have tried to sell various VW add-ons (as have many others) with little success. The VW folks are notoriously price-sensitive: a few months ago, an independent developer wanted to find out (on the VectorDepot forum) whether there was demand for a product he had in mind. The response was: yes, we want that, but it should be free.

    Now, my offerings perhaps have not been relevant to a large number of people, but in my correspondence with people who appear to require a specific feature (such as export to MapInfo), mentioning the war, uhhm, the price, has always resulted in total silence at the other end.

    (Typos, typos and more typos...)

    [ 06-06-2003, 10:12 AM: Message edited by: Petri Sakkinen ]

  16. Is it a limitation, is it a bird, is it an aeroplane?

    The way it works is simply (?) this: a project that uses WGR is a multi-user project, but for each data set (= VW file), there can only be one 'author'. When a file is opened, the user in question becomes the author and is the only one who can modify the file. After that editing session is completed, verified by closing the file, the new data is available.

    This is not necessarily a limitation, but possibly a 'management procedure'. Would you really like to see every change, experiment, addition etc. by someone else changing your working environment all the time?

    Let's assume (for the sake of simplicity) that you are working on the furniture layout of a restaurant with a Swiss Democracy theme. Another person is making changes to the partitioning and floor coverings, while the Third Man tries to place the cuckoo clocks. You would never know where anything is. Now, the cuckoo clock is here, and in 15 seconds it is there, only to be moved again. Meanwhile, you are trying to place the tables to align with both partitions and cuckoo clocks, but can't, because they are all over the place. And the same applies to your co-conspirators... where did THAT table go?
