Dieter @ DWorks Posted October 14, 2013

Hi,

While trying out the Python world in VW, I found out that VW caches scripts, as changes aren't picked up immediately. It's very annoying to have to restart VW after every code change. Can we disable this caching?
Miguel Barrera Posted October 14, 2013

I do not see this behavior, as any changes run immediately. The only difference I can see is that I am still running Windows 7. The "VS Compiler Mode" command is still available, but I do not know whether it now applies to all scripts or just to VectorScript, as the label suggests.
JBenghiat Posted October 14, 2013

New for VW 2014: go to Vectorworks preferences, Session, and enable "Run Scripts in Developer Mode." This both suspends caching and shows warnings. Because it is a preference, it persists across VW restarts.

-Josh
Dieter @ DWorks Posted October 15, 2013

I checked the preference, but I still see the caching behaviour and I don't know why. It sometimes notices the changes immediately, but sometimes not.
S. Robinson Posted January 12, 2014

I have seen this when importing modules from my linked external text file (in my case "_main.py"). The simple workaround is to explicitly reload() your module:

    import vs, modulename
    from imp import reload
    reload(modulename)

Adding this at the top of _main.py has solved the problem for me. You will probably want to comment it out once you are done editing the module.
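Outside of Vectorworks, the effect of reload() can be demonstrated with plain Python. The module name "cachedemo" and the temp directory here are made up for illustration, and importlib.reload is used as the modern equivalent of the imp.reload in the snippet above:

```python
import os
import sys
import tempfile
from importlib import reload  # modern spelling of imp.reload

sys.dont_write_bytecode = True  # skip .pyc files so reload always re-reads the source

# Write a throwaway module to disk ("cachedemo" is a made-up name)
moddir = tempfile.mkdtemp()
with open(os.path.join(moddir, "cachedemo.py"), "w") as f:
    f.write("VALUE = 1\n")
sys.path.insert(0, moddir)

import cachedemo
print(cachedemo.VALUE)   # 1

# Change the source on disk; a second import does NOT see the change,
# because sys.modules still holds the first copy
with open(os.path.join(moddir, "cachedemo.py"), "w") as f:
    f.write("VALUE = 2\n")
import cachedemo
print(cachedemo.VALUE)   # still 1

reload(cachedemo)        # re-executes the module's source
print(cachedemo.VALUE)   # 2
```

This is exactly the stale-module behaviour described in the thread: the second import is a no-op, and only an explicit reload picks up the edit.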
Dieter @ DWorks Posted December 28, 2014

Hi, sorry to bring this back up, but it's still not fixed. reload() works, but then you have to do it for every import in every file, which is not practical when you structure your work properly... Is there another solution to this? I wouldn't mind restarting VW if it actually started, but it crashes on startup most of the time...
JBenghiat Posted December 29, 2014

Here is a post from the VS mailing list with a full explanation of Python caching and its options:

Python normally compiles the .py files when executing them, which means it should always pick up changes that you make to the source files. However, in Vectorworks the Python engine is not restarted, and it cleans up after each execution. This is done so different scripts don't interfere with each other. It also interferes with Python's ability to pick up changes, so you might see the problem where you change source files and the change is not reflected.

The root of the problem is the 'import' statements. You can try putting the 'import' inside functions:

    def execute():
        import myLibrary
        import somethingElse

As another alternative, you can try setting the following program variable to TRUE. This is a developer-mode option to stop VW from cleaning up the Python engine, so your scripts run in the same environment and Python correctly picks up all changes. On the flip side, enabling it can cause crashes inside the Python engine (which is why it is pulled out as a separate option until Python fixes this issue):

    varPersistentPythonEngine = 412,
        // Boolean -- when True, the Python engine is the same for the execution of all scripts.
        // This solves some issues with Py_Initialize and Py_Finalize.
        // For example, when debugging externally, Python leaves threads that cause a crash
        // if Py_Initialize and Py_Finalize are used for each script call.
        // So, this allows the engine to be preserved between calls;
        // however, Vectorworks will delete all custom modules and objects
        // defined in the engine prior to each execution.
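The import-inside-a-function suggestion can be illustrated outside VW with plain Python. Since "myLibrary" is hypothetical, a stdlib module stands in here:

```python
import sys

def execute():
    # The import statement runs every time execute() is called.
    # If the module is still in sys.modules it is a cheap lookup;
    # if something (like VW's cleanup between script runs) removed it,
    # it is re-imported from source, picking up any changes.
    import json
    return json.dumps({"ok": True})

print(execute())  # {"ok": true}

# Simulate the engine purging the module between script runs:
sys.modules.pop("json", None)
print(execute())  # re-imports json transparently, same result
```

Because the import happens at call time rather than at module load time, the function keeps working no matter what the host did to sys.modules in between.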
Musisback Posted April 20, 2015

I use a different approach to solve this issue: a command that clears all the modules of a designated package (in the example: myLibrary) by removing them from sys.modules. Python will therefore import them again at the next import statement. I find this much easier than placing reload(module) statements everywhere in my code, especially since those need to be removed for deployment.

    import sys

    modulesToDelete = []
    for moduleName in sys.modules:
        if "myLibrary." in moduleName:
            modulesToDelete.append(moduleName)

    currentModule = __name__
    for mod in modulesToDelete:
        if mod != currentModule:  # Python cannot delete the currently running module
            del sys.modules[mod]
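That loop can also be wrapped into a reusable helper, so one call purges a whole package by prefix. The package name passed in is whatever your own library is called; "fakepkg" in the usage note is a placeholder:

```python
import sys

def purge_package(prefix):
    """Remove a package and all its submodules from sys.modules so the
    next import statement re-reads them from disk."""
    # Collect first, then delete: mutating sys.modules while iterating
    # over it raises a RuntimeError.
    doomed = [name for name in sys.modules
              if (name == prefix or name.startswith(prefix + "."))
              and name != __name__]  # never delete the running module
    for name in doomed:
        del sys.modules[name]
    return doomed
```

Calling purge_package("fakepkg") before re-running a script then forces fresh imports of fakepkg and every fakepkg.* submodule.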