
Hello

For certain reasons I am "misusing" VectorWorks for some GIS-related tasks. I have to find, for a huge number of simple 2D polygons, all 2D points on a certain layer that lie inside each polygon - the so-called "point in polygon" or "PIP" problem. My VectorWorks Python script using the "vs.FindObjAtPt..." functions works well for a small number of polygons and points, but is frustratingly slow for a large number of objects to be matched/tested - I have now been sitting in front of the screen for more than 72 hours, waiting without any hope and with no idea when the script will terminate (given that the vs.Message function still does not work in VW2020). So my idea is to "outsource" the whole PIP processing to suitable Python modules like Geopandas, where solving diverse PIP problems is a standard feature.

So, does anyone already have experience or a best practice for how to transfer ("poke out") the geometric VW object data most efficiently and quickly from a VW Python script into a Geopandas data structure (data frame), e.g. for a polygon in the form (x0 y0, x1 y1, x2 y2, ..., xn yn)? The question relates to the VW functions for accessing an object's geometric information, hopefully in a single step (function call).

Many thanks in advance for any hint and best regards,

relume

If I am understanding your question correctly, you're wanting to grab the polygon vertex information and transfer it to a pandas dataframe?

If so:

1. You run a criteria to collect all the polygons that you're wanting

2. Then use vs.GetPolylineVertex(obj, vertexNum) to grab the coordinates of each vertex, collect those vertex coordinates and place them in the dataframe.

I will have a go at it later this evening..

```python
import os
import pandas as pd
import vs

layer = "AC_Parcel_polygon"  # pass in whatever layer name your polygons are on
# criteria string that finds polygons and polylines on layer=layer
criteria = "(((T=POLY)|(T=POLYLINE)) & (L='{}'))".format(layer)

polygons = []  # list to store polygon handles

def collect_polygons(h):
    polygons.append(h)

vs.ForEachObject(collect_polygons, criteria)  # use ForEachObject to collect polygons

# Store the polygon data in a flat pandas dataframe, one row per vertex:
# Polygon Index, Vertex Index, Vertex X, Vertex Y, Vertex Type, Vertex Radius
databased_polygons = []
for i, poly in enumerate(polygons):
    num_vertices = vs.GetVertNum(poly)
    for v_index in range(num_vertices):
        vrtxPt, vrtxType, vrtxRadius = vs.GetPolylineVertex(poly, v_index + 1)
        databased_polygons.append(
            (i, v_index, vrtxPt[0], vrtxPt[1], vrtxType, vrtxRadius)
        )

columns = ["Polygon Index", "Vertex Index", "Vertex X", "Vertex Y", "Vertex Type", "Vertex Radius"]
dataframe = pd.DataFrame(databased_polygons, columns=columns)

script_path = ''  # enter dir path here to store the csv file
dataframe.to_csv(os.path.join(script_path, "Polygons.csv"))  # store dataframe as csv for external checking

print(dataframe)
raise Exception  # raise exception to see the print output in the Vectorworks error console
```
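Once you have a flat per-vertex table like that, the rows can be grouped back into one (x, y) coordinate list per polygon, which is the shape Geopandas/Shapely expect when constructing Polygon geometries. A minimal sketch with pandas only, assuming the dataframe carries "Polygon Index", "Vertex X" and "Vertex Y" columns (the sample rows here are made up for illustration):

```python
import pandas as pd

# Hypothetical flat per-vertex table, as the script above would produce
df = pd.DataFrame(
    [(0, 0.0, 0.0), (0, 4.0, 0.0), (0, 4.0, 4.0),
     (1, 1.0, 1.0), (1, 2.0, 1.0), (1, 2.0, 2.0)],
    columns=["Polygon Index", "Vertex X", "Vertex Y"],
)

# Group the rows back into one (x, y) coordinate list per polygon
coords = {
    idx: list(zip(g["Vertex X"], g["Vertex Y"]))
    for idx, g in df.groupby("Polygon Index")
}
print(coords[0])  # [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0)]
```

From there, each coordinate list can be passed straight to a geometry constructor (e.g. shapely's Polygon) on the Geopandas side.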

Hello

I hope the dataframe-populating process will not be slowed down too much by the per-vertex VW function calls - but it seems to be the only way to do it.
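For what it's worth, the per-point test that Geopandas/Shapely would run once the data is transferred can be illustrated with a plain ray-casting sketch - pure Python, no VW or Geopandas dependency, with the polygon as a list of (x, y) tuples matching the "(x0 y0, x1 y1, ...)" layout discussed above (function name and sample square are just for illustration):

```python
def point_in_polygon(px, py, vertices):
    """Return True if point (px, py) lies inside the polygon (ray casting)."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Count how often a horizontal ray cast rightwards from the point
        # crosses a polygon edge; an odd count means the point is inside.
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
print(point_in_polygon(2.0, 2.0, square))  # True (inside)
print(point_in_polygon(5.0, 2.0, square))  # False (outside)
```

Geopandas speeds this up with spatial indexing rather than a faster inner test, which is why it should beat a per-point vs.FindObjAtPt loop on large datasets.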

best regards
