Usefulness of Vectorscript include files


Unless I am very much mistaken, the usefulness of Vectorscript include files dates back to the dark ages.

I have over 25 years of commercial programming experience, and my language was predominantly Pascal. I wrote huge applications whose maintenance by many people was critical. I have even written several compilers based upon Pascal-style syntax, none of them having a limitation such as the one I am finding myself up against now.

I am now writing a noddy bit of Vectorscript and have just decided to split it up into functional parts using Vectorscript include files.

My requirements are very straightforward: I have an external script that contains some useful routines. These routines share some common TYPE and VAR declarations. The whole lot sits in a Vectorscript file called StakeTools.vss

Now, due to the apparent limitations of Vectorscript include files, I cannot so far fathom a way of including the external file without breaking the integrity of the include.

Normally, when developing a software application, you create a number of modules. These are all standalone and the interface in and out is clearly defined. This is how I have structured my program. Each module would normally self compile to create an object code with symbolic links to external routines. You then use a linker to resolve these symbolic links and create an executable file.

Understandably, Vectorscript does not have the complexities of a linker, instead including source code in-line. However, I would have expected it to provide a way of including inline sections of code, allowing it to utilise modular include files instead of insisting on a strict ordering of TYPEs, VARs, main bodies, etc.

Instead, it appears to enforce a strict program structure that will force me to split my module-scope types and vars out of the include file and declare them in the main script in the relevant places, completely defeating the usefulness of having include files.

Please, someone, say that I have made an oversight and Vectorscript will in fact allow me to include a module that includes vars and types specific to the module.


You're correct. That's one of the things I don't like about how it works.

I only use include files for my own library routines, and for splitting up the project. I only put methods in the includes that don't need to use the global variables.


Another possibility is to break your include into three files: Include, Include-Var, Include-Const.

Then you can include each part in the proper part of the main script.
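A minimal sketch of that three-file split (file and symbol names are made up for illustration); each fragment is included at the point where the language expects that kind of declaration:

```pascal
{ StakeTools-Const.vss : constants only }
CONST
	kMaxStakes = 100;

{ StakeTools-Var.vss : variables only }
VAR
	stakeCount : INTEGER;

{ StakeTools.vss : routines only }
PROCEDURE ResetStakes;
BEGIN
	stakeCount := 0;
END;

{ Main script : each part included where the language expects it }
PROCEDURE Main;
{$INCLUDE StakeTools-Const.vss}
{$INCLUDE StakeTools-Var.vss}
{$INCLUDE StakeTools.vss}
BEGIN
	ResetStakes;
END;
RUN(Main);
```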

Yes, it could be better, and this split shouldn't be required. If you want this changed, please add it to the wish list.

I have also had problems in the past using nested includes. File Include2 includes Include3, and file Include1 includes Include2. Code in Include3 is not seen in Include1.
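For illustration, a hypothetical version of that nesting (file names made up):

```pascal
{ Include3.vss }
PROCEDURE Three;
BEGIN
	AlrtDialog('three');
END;

{ Include2.vss }
{$INCLUDE Include3.vss}
PROCEDURE Two;
BEGIN
	Three;	{ fine: Include2 pulls in Include3 first }
END;

{ Include1.vss }
{$INCLUDE Include2.vss}
PROCEDURE One;
BEGIN
	Three;	{ reportedly fails: code from Include3 is not seen here }
END;
```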

Please, someone, say that I have made an oversight and Vectorscript will in fact allow me to include a module that includes vars and types specific to the module.

I'm hardly an expert on $include (or programming for that matter), and I have a question about this.

If the include has no global variables, then as DWorks says it's good to go.

So you need to construct the 'modules' as fully formed procedures, complete with formal and variable parameters for passing data from and to the main script. Then in the main script you call your procedures from the include and use them exactly as you would the 'pre-made' ones we're all familiar with, using the globals declared in the main script.

Does this work as I think it might?

Can you have an include that's just a list of modules like:

PROCEDURE DoSomething(data1, data2 : REAL; VAR result : REAL);
BEGIN
	{ ...so on and so forth... }
	result := 0;	{ the output of all this }
END;


What would be the limitations of this if it does work?

At what point does it become unwieldy?


Yes, $Include files that encapsulate all of the variables/constants inside of Procedures/Functions work fine. Just make sure you place them high enough in your code so they are compiled before they are called.
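For example, a fully self-contained include might look like this (names hypothetical); note that the directive sits inside the main procedure, before the first call:

```pascal
{ StakeUtils.vss : no globals, everything local to the function }
FUNCTION StakeSpacing(totalLen : REAL; count : INTEGER) : REAL;
VAR
	spacing : REAL;
BEGIN
	spacing := 0;
	IF count > 1 THEN spacing := totalLen / (count - 1);
	StakeSpacing := spacing;
END;

{ Main script }
PROCEDURE Main;
{$INCLUDE StakeUtils.vss}	{ compiled before it is called below }
BEGIN
	AlrtDialog(Num2Str(3, StakeSpacing(100, 5)));
END;
RUN(Main);
```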


@ccroft: You're right, but as Pat says, you'll have to be sure they are included before they are called. Because of this, you'll also need to make sure these methods don't depend on methods in other include files.

You can make your own methods on top of the basic methods provided by NNA and include them. Then you can use these just like you would use the basic methods. Just put the include right after the VAR section.



I would consider include files more like libraries of constants and/or self-contained procedures and functions, which are typically grouped by subject. I believe this feature was introduced in VW11 to resolve the size limit of scripts. I only know this because I am guilty of hitting this mark and rather than splitting the code in parts, I condensed the code to stay within the limit.

The $INCLUDE directive is just external storage space for the code. At compile time, the file's text is inserted at the line where the directive appears. This is why it is important to place the directive at the top of the script and to follow the language's order of statements (CONST, TYPE, VAR, PROCEDURE and FUNCTION).
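To make the textual substitution concrete, here is a sketch (file name and constant made up). If MyConsts.vss contains only `CONST kScale = 2.5;`, then this script:

```pascal
PROCEDURE Main;
{$INCLUDE MyConsts.vss}
VAR
	x : REAL;
BEGIN
	x := kScale;
END;
RUN(Main);
```

compiles exactly as if you had written:

```pascal
PROCEDURE Main;
CONST
	kScale = 2.5;
VAR
	x : REAL;
BEGIN
	x := kScale;
END;
RUN(Main);
```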


So if you have an include that lists a bunch of self-contained procedures as well as constants for use in other scripts, you might run into syntax problems in the including scripts, depending on where you place the directive.

Seems like a good enough reason to keep your includes segregated by type of code: all procedures and functions, all constants, and so forth.

Miguel, I think you're right about the when and why. I've never been close to that limit, so copy and paste has been my method for recycling code.

The other main use of include seems to be in developing .vso to bypass jumping in and out of the plug-in editor. That's really my only experience with it.

Gentlemen, I thank you!


Usage of globals should be kept to a minimum.

Globals can indeed cause unwanted mistakes when you, being human, don't keep proper track of their values. On the whole, functions are preferable to procedures wherever applicable.

But perhaps more intelligent people can keep track of them. I can't on large scripts.

Try to keep a universal library of sub-routines and link it only through relative paths. Don't ever copy and paste: if you find yourself copying and pasting a routine, that very act makes it eligible to become a generic routine linked from an include file.

I let my library serve both PC and Mac, across two versions.

Keeping a general library has many advantages, not least that when you fix an issue in a routine, it gets fixed in all scripts using it. And you get quite an overview of how it behaves across many different usages (since you are likely to have several scripts using it; otherwise you wouldn't have made it a generic sub-routine).

Whatever the path of your includes is, it is calculated relative to the plug-in file: the xxx.vsm, yyy.vst, or zzz.vso file. You can place an alias of the includes folder and this will work, no matter where your includes folder actually is.

Some of the include files need the presence of others in order to work, so the sequence of the includes becomes relevant. Below is an example sequence from a large script.

It is very easy and agreeable to work this way.


A menu command: blablabla.vsm, in the plug-in editor you tell it that the script is to be fetched from an include file:

{$INCLUDE ..\..\z_VS_includes\z_blablabla.px}

In the plug-in editor you'll have nothing else.

Within the blablabla.px file the whole script resides. This px file also contains calls to your generic library of includes. For example:


{ some script-specific constants, then common-ware: }

{$INCLUDE ..\..\z_VS_includes\_common\_K_Dlog.px} { dialog constants }

{$INCLUDE ..\..\z_VS_includes\_common\_K_LB.px} { list browsers constants }

{$INCLUDE ..\..\z_VS_includes\_common\_K_Txt.px} { text constants }


{ some script-specific vars, then common-ware }

{$INCLUDE ..\..\z_VS_includes\_common\Utils.px} { whatever doesn't fit any cat }

{$INCLUDE ..\..\z_VS_includes\_common\Txt.px} { text manipulation routines }

{$INCLUDE ..\..\z_VS_includes\_common\FileIO.px}

{$INCLUDE ..\..\z_VS_includes\_common\Res.px}

{$INCLUDE ..\..\z_VS_includes\_common\Dlog.px}

{$INCLUDE ..\..\z_VS_includes\_common\LB.px}

{$INCLUDE ..\..\z_VS_includes\_common\XML.px}

{$INCLUDE ..\..\z_VS_includes\_common\Han.px}

{$INCLUDE ..\..\z_VS_includes\_common\Attr.px}

{$INCLUDE ..\..\z_VS_includes\_common\Color.px}

{$INCLUDE ..\..\z_VS_includes\_common\Attr_Dialog.px}

{$INCLUDE ..\..\z_VS_includes\_common\Math.px}


I'm going to put in a wish list request so that CONSTs, TYPEs and VARs can appear out of order when an include file is used.

My situation possibly stems from developing large applications, where a whole host of middleware routines are used by many different sub-applications. I tried to downsize this approach for my set of tools/commands and it just did not work in an elegant manner: splitting out types, vars, etc. becomes unwieldy very quickly as the number of external 'modules' increases, not to mention that the tools then have to understand the implementation/structure of each module.

Vectorscript is a scripting language, and I must not forget that, but it would be nice for it to allow modular development. Having written a number of compilers in my life, I know that if the compiler is written correctly, it should be relatively easy to relax the ordering of these sections. That would allow all logical/functional parts of the 'module' to reside together, and the tool/main script would not have to bother with the implementation of the 'module', i.e. whether it has CONSTs, TYPEs, VARs, etc.

I very quickly fell foul of the current limitation.

I had a routine to sort a dynamic array of stake objects. In a perfect world, there would have been read and write access procedures and functions for this array, and the array would have been local to the tool 'module'.

But due to the non-modular approach of Vectorscript, which I can understand as it is just a scripting language, the whole module is included, so the dynamic array becomes local to the whole tool rather than local to the 'module'. That is not a problem in itself, as I can still use the read/write access functions, but since the dynamic array is now global, its structure also becomes global. And it is these local definitions being forced global, and thus having to follow the global ordering of CONST, TYPE, VAR, that causes the problem, as they logically have to be removed from the 'module'.

So relaxing the order of CONSTs, TYPEs and VARs would allow declarations to remain textually local to the module rather than global to the application, even though, in reality, their scope would then be global.
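A sketch of what the limitation forces (names hypothetical): the TYPE and VAR have to be hoisted into the main script, while only the accessors can stay in the include.

```pascal
{ Main script : module declarations forcibly moved out of the include }
PROCEDURE Main;
TYPE
	StakeObj = STRUCTURE
		x, y : REAL;
	END;
VAR
	gStakes : DYNARRAY[] OF StakeObj;

{ StakeTools.vss would now hold only the accessors, e.g.: }
PROCEDURE SetStake(i : INTEGER; px, py : REAL);
BEGIN
	gStakes[i].x := px;
	gStakes[i].y := py;
END;

BEGIN
	ALLOCATE gStakes[1..10];
	SetStake(1, 0, 0);
END;
RUN(Main);
```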


You'd be surprised by how few people actually wish to change VS.

I share your pain.

BTW, with 16 years of VS behind me, I don't find that you have to use globals at all. You do have issues with dynarrays, yes, but that's about all. The rest doesn't need to be global.


Edited by orso b. schmid



© 2018 Vectorworks, Inc. All Rights Reserved. Vectorworks, Inc. is part of the Nemetschek Group.
