Daniel B. Chapman

Feature Request: NDI Support

Recommended Posts

I've recently been using NDI in my previz workflow to pass streams from machine to machine, and it is definitely easier than any of the other solutions I've worked with recently. Can I request that we get this into Vision to replace the clunky video capture system?




Here's a link to the openFrameworks plugin, which gives some idea of how easy this actually is:



I think the receiver code (video source) is less than 600 lines of code interfacing with the SDK.


Hey Daniel,


While I must point out that integrating something new into existing code is rarely as easy as it may seem, this is interesting. I've seen something similar in use with large video walls in the past. Which media servers are using this protocol? I haven't looked at them in a while, so I'm a bit behind the times.




As a developer myself I can understand the hurdles. I'd also like to point out that video input has been broken in Vision 4, Vision 2017, and Vision 2018, with some, but not all, capture cards working in Vision 2.3. I have four computers I can test against (a Surface, a 2016 MacBook Pro with Touch Bar, a 2010 MacBook Pro, and a box I built myself running Windows 10 that's nothing fancy), and none of them has working video input. macOS comes the closest, with some bizarre pixelated green/magenta image of my webcam.


Considering that video is broken in Vision as it stands, you might want to explore "universal" connectors like NDI, since your ability to support hardware seems hampered at best. I could be wrong and a video patch may be on the way, but I downloaded the Vision 2018 demo today and I get the same broken interface with no available inputs for video projection.


While I'm on the topic it would be nice to see you integrate:

  * Blackmagic inputs (the SDK is pretty well documented). Why? Because they are solid, affordable capture cards that run on Mac and Windows.

  * Syphon inputs on Mac (again, very well documented software and a simple texture share; given that you're rendering in OpenGL this is borderline trivial). Why? Because it is native to macOS and used in the VJ pipeline by basically every media server. This bypasses the hardware issue entirely, so it seems like an obvious choice.

  * Spout inputs on Windows (which is basically Syphon for Windows). Why? Because it is Syphon for Windows. It bypasses the hardware chain entirely and is a simple shared texture.

  * Support for virtual webcams in general, which would render all the above moot, as you could use Syphon or Spout to make virtual inputs.



Or something like NDI, which has converters for Blackmagic, Syphon, Spout, capture cards, webcams, other media servers, and CITP protocols. Since this runs over TCP/IP it also renders the hardware question moot. As a software company, the phrase "renders the hardware question moot" should be of interest to you.


I've been fairly patient on this issue, but I'm currently evaluating Capture Nexum and Realizzer for purchase on my next project because Vision is broken for video. I've already suspended my subscription until this issue is fixed, and I suspect that means I'll be running Vision 2017 for a very long time at this rate.


"Which media servers are using this protocol I haven't looked at them in a while so I'm a bit behind the times."

This isn't really a valid question at this point given the state of projection design. Not everyone is running a Hippotizer or a D3 on gigs. Many of us use Watchout, QLab, Resolume, or a variety of other pipelines (TouchDesigner/Isadora) in our work at the regional level for theater and dance, and the inability to plug anything into Vision is extraordinarily frustrating.




** By "not plug anything into" I mean I literally cannot purchase a capture card that will work with Vision 2017/2018 on Windows 10 or macOS and render video inside Vision. This isn't a case of me trying to skirt a hardware requirement. I would happily pay $300 to get this working right now, and the answer is I cannot.

Edited by Daniel B. Chapman




Believe me, I understand your frustration. I probably get more upset than you do when Vision falls short. I, along with others, have years of our lives in this product, so it's something we very much want to be good.


As for what media servers support the protocol, that's my fault: in my head, when I said media servers I was including software in that. What I'm really asking is: how many people would I make happy if we put the time into this instead of fixing the existing system? We only have so many hours in the day, so we want to make as many people happy with that time as we can. To justify putting time into changing the existing system, I have to be able to say I'll make more people happy adding the new thing than I would fixing the old thing.


Honestly, I like the idea, but I do have some concerns. Glancing over their website I saw they have a VLC plugin, which is super awesome, but their download links are broken so I couldn't check it out. Same thing for the SDK. If you happen to have good download links for those, please DM them to me so I can give it a look. One of the major things I'm concerned with is that you said it uses converters to deal with other protocols. Are these converters hardware or software? How much do they cost? Who makes them? With capture cards there are lots of vendors and price points; can we say the same thing about NDI?


These are all questions we have to ask when we’re trying to figure out where our time is going to be best spent. So, the more detail you can give me, the better.




I think the general problem with the way Vision is attempting to handle video is that you are not providing solid hardware support, and the only reliable option seems to be CITP, which is pretty similar but far more complex to implement for most applications because it manages a lot more than video.

NDI's SDK is here. I have not personally coded up an implementation (though I'm probably adding it to my media servers after this opera). It is really just a UDP video stream that you can tap into.


Essentially the pipeline is just:
[NDI Stream Name]::[Frame #][some headers with sizing][pixels]

Since that's pushed over IP it is available on localhost or on your network. It has a couple of little preview images that get pushed, but in general you just pick a name and then decode it on whatever device you want.
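The layout described above can be sketched as a toy serializer. To be clear, this is an illustration of the idea (length-prefixed stream name, frame number, sizing header, then pixels), not the real NDI wire format, and the field widths are assumptions:

```python
import struct

def pack_frame(stream_name: str, frame_no: int, width: int, height: int,
               pixels: bytes) -> bytes:
    """Serialize one frame: length-prefixed name, then a fixed sizing header."""
    name = stream_name.encode("utf-8")
    header = struct.pack("!H", len(name)) + name          # 2-byte name length + name
    header += struct.pack("!IHH", frame_no, width, height)  # frame #, width, height
    return header + pixels

def unpack_frame(payload: bytes):
    """Inverse of pack_frame: recover name, frame number, size, and pixels."""
    (name_len,) = struct.unpack_from("!H", payload, 0)
    name = payload[2:2 + name_len].decode("utf-8")
    frame_no, width, height = struct.unpack_from("!IHH", payload, 2 + name_len)
    pixels = payload[2 + name_len + 8:]
    return name, frame_no, (width, height), pixels

# Round-trip a tiny 2x1 "frame"
payload = pack_frame("SyphonNDI", 42, 2, 1, b"\x80\x80\x80\x80")
print(unpack_frame(payload))
```

Each payload like this could then be handed to a UDP socket; the receiver just looks up the stream by name and decodes whatever arrives.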


Here's a link to two utilities I use, because I'm using OpenGL calls most of the time:

On Windows I just take a Spout broadcast and throw it over the network.


On Mac I run


which is basically the same thing.


This takes the video hardware entirely out of the chain, because you're taking a texture and mapping it to a video stream that is decoded on the other end. It really is pretty simple, especially if you are (and I assume you are) just taking a texture and mapping it to your light source for video.


As to your concerns about converters: yes, that's a problem, but it is significantly less of a problem than trying to support all the native capture cards out there (and it is clear your native hooks do not work at the moment). The IP solution bypasses this because there is a variety of utilities to get the format into NDI. There's a "pro version" here for Blackmagic cards:

https://itunes.apple.com/us/app/ndi-source/id1097524095?ls=1&mt=12 which makes some pretty cheap USB3/Thunderbolt hardware available for you.


Frankly, I'd rather see Syphon/Spout connectivity, but I operate at a pretty high technical level for video/projection design as I write my own software for most of it. I think having some simple connectors available would be a very quick way to get around this problem. I suspect NDI is going to be a standard that sticks around for quite a while. I first used it on a major corporate event and I'm seeing it everywhere now. 


At the end of the day I don't actually care how I get a video stream into Vision; I just need to be able to get a video stream into Vision. NDI makes multiple inputs a lot easier, and in many cases you can completely bypass a capture card with software. That being said, running an HDMI cable out of a Mac into the back of my PC wouldn't really bother me if it meant I could render projection-mapping effects easily for a director. In many ways I think supporting Blackmagic devices would be a good choice, as they are relatively low cost (Intensity Shuttle) and the SDK is available for Windows/Mac, so you don't have to do the hard work or depend on the operating system.


At the end of the day I just want to get a video source into Vision. 2.3 just doesn't cut it for professional renderings (and I did make that work with an AverMedia PCIe card). 

Edited by Daniel B. Chapman



I totally get it. There's "like to have" and "need to have," and we need to have video streaming back in Vision in some form, so we're on the same page there. Honestly, I'd love it if we could support both, or everything really, but again there are only so many hours in a day. On the upside, NDI is cross-platform, and at least at the concept level it seems simple enough. The converters, though, may be a harder sell if we looked at this as a full replacement.


As for Spout and Syphon, we've been asked about them several times. Spout is in a lot of stuff now, so it would be cool to support it, but it's Windows only; Syphon has the same problem in reverse, being Mac only. Everything in Vision is cross-platform, so as it is now, I don't know that either would be considered because of that. I could be wrong, though. If there were a library that rolled them together so we could use a single call for either in Vision, that might be something to consider.


On the upside, the library we use for capture actually has support for a lot of hardware, including Blackmagic. It's just a matter of running down why it's not working, which we are working on. I've personally put in more hours than I can count looking at this one thing. That being said, while I can't promise anything, I will be doing some deeper research into NDI once I get their SDK.




I'm actively using another product for previsualization right now, and the NDI workflow is extremely easy. I'm using Open Broadcast Studio to screen-capture media server output and send it over the network, so there's actually no hardware in this pipeline. NDI has solid support in openFrameworks, and several capture cards (Blackmagic) have utilities for reading directly into a stream. It is easier than dealing with HDCP issues between devices and is simple to use. I'm able to easily stream two SVGA-resolution outputs over WiFi with no hardware.


If you opt not to support it, at a minimum it would be nice to see support for Windows Media Foundation-compatible cards and AVFoundation-compatible cards on macOS. If you supported the core frameworks, I suspect we would all have a much easier time pulling in video.



@Daniel B. Chapman

Keep an eye on this thread: 

We will be keeping it up to date with the latest information. I would hold off on buying capture cards until we can make some cheaper recommendations (a $3k capture card is unrealistic), but at the very least if you have a capture card already, you can verify that it either does or does not meet the requirements we believe are necessary for capture to work properly.


And to address your last post a little more directly: it appears that as long as your capture card supports UVC and YUYV422/YUV420P, we can use the Windows Media Foundation and AVFoundation libraries to get it to work with Vision.
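As a rough illustration of what the YUYV422 format mentioned above actually holds, here is a minimal sketch that decodes it to RGB using the common BT.601 full-range math. A real capture path would do this on the GPU, and the exact coefficients a given card or framework uses may differ:

```python
# YUYV422 packs two horizontally adjacent pixels into 4 bytes: Y0 U Y1 V.
# The two pixels share one pair of chroma samples (U, V).

def yuyv422_to_rgb(data: bytes):
    """Decode a YUYV422 byte string into a list of (R, G, B) tuples."""
    def clamp(x):
        return max(0, min(255, int(round(x))))

    def to_rgb(y, u, v):
        # BT.601 full-range YUV -> RGB (an assumption; hardware may use BT.709)
        r = y + 1.402 * (v - 128)
        g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
        b = y + 1.772 * (u - 128)
        return (clamp(r), clamp(g), clamp(b))

    pixels = []
    for i in range(0, len(data), 4):
        y0, u, y1, v = data[i:i + 4]
        pixels.append(to_rgb(y0, u, v))
        pixels.append(to_rgb(y1, u, v))
    return pixels

# Two mid-gray pixels: Y=128 with neutral chroma (U=V=128)
print(yuyv422_to_rgb(bytes([128, 128, 128, 128])))
```

The green/magenta webcam image described earlier in the thread is a classic symptom of exactly this kind of conversion going wrong (e.g., swapped chroma planes or a format mismatch).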


Ok, can someone please explain to me what I'm doing wrong here? 



I created a basic 3D polygon, assigned it to Universe 1.1, and gave it the name of the NDI stream (the NDI source is actually SyphonNDI). I reset my Vision video sources and... nothing. The back of the texture is also blank.

Quick Edit: all my other NDI tools on this workstation are fine. I'm actually previzing this project in Capture right now...

Edited by Daniel B. Chapman

On 9/15/2019 at 5:55 PM, Daniel B. Chapman said:

NDI is actually SyphonNDI

Daniel, can you tell me more about Syphon?

I'm starting a project right now on which I'm hoping to use NDI streams to apply content to LED screens.


@Charlie Winter


I'm running QLab and some proprietary stuff for most of my designs, and Syphon is a way to share a texture between applications on the Mac. I output Syphon rather than a video surface and then use programs to stream that over the network or share it with another program. Spout is the Windows equivalent. You might want to look into Open Broadcast Studio (https://obsproject.com/); they have a good NDI plugin that can stream arbitrary content.


For a small video wall it should work fine, but for larger surfaces you need some serious processing power and a very fast network to make it work. I generally use NDI for the previsualization side of things, as I'm running at half or even quarter resolution.
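To put rough numbers on why bandwidth matters here: the raw bitrate below is exact arithmetic for uncompressed 4:2:2 video, but the 20:1 compression ratio is an assumption for illustration, not a measured NDI figure:

```python
def raw_mbps(width: int, height: int, fps: int, bytes_per_pixel: int = 2) -> float:
    """Uncompressed bitrate in Mbit/s (YUV 4:2:2 = 2 bytes per pixel)."""
    return width * height * bytes_per_pixel * fps * 8 / 1_000_000

# One SVGA (800x600) output at 30 fps, uncompressed
svga = raw_mbps(800, 600, 30)
print(f"raw SVGA @ 30fps: {svga:.1f} Mbit/s")

# Two streams with an assumed 20:1 compression ratio (hypothetical)
print(f"two compressed streams: {2 * svga / 20:.1f} Mbit/s")
```

Even with generous compression, two streams land in the tens of Mbit/s, which is why low resolutions work over WiFi while full-resolution walls want a fast wired network.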






© 2018 Vectorworks, Inc. All Rights Reserved. Vectorworks, Inc. is part of the Nemetschek Group.
