Dave_Donley

Member
  • Posts

    63
  • Joined

  • Last visited

    Never

Everything posted by Dave_Donley

  1. Hello Shaun: There is a possibility that this technology will be included with 13, but the NNA policy is not to talk about or promise things for future releases. There are several reasons why this is a wise policy. It would be much worse for us to announce that the next version will have feature X and then for some reason be unable to include it. Dave Donley dave@nemetschek.net
  2. Hi jaz1416: What specifically do you mean by adding diffusion? Is it like de-focusing the light or something? Dave Donley dave@nemetschek.net
  3. Line lights are similar to area lights, in that multiple light sources are automatically created along the light's geometry. Many more samples can end up being used than with a simple light type like a point light, for example. Area and line lights render slower when there are reflective surfaces in the model, like Phong, Mirror, Glass, etc. Line and area lights can render faster with radiosity than with raytracing, and usually render fastest with radiosity and minimal reflective textures. An extrude along path should be OK; it is usually better to keep geometry as high-level as possible for as long as possible, versus converting it to simpler, lower-level types of geometry. Radiosity especially works better with extrudes along path versus tiny 3D polygons. Reducing the geometry load, the number of pixels, and the cost-per-pixel (as with reflective surfaces and recursion levels) is the name of the game. HTH, Dave Donley dave@nemetschek.net
  4. Hello MikeB: This change will only make a scene render slower if that scene actually has reflective surfaces that would bounce between themselves dozens of times, which does not happen easily. This default was changed so that overlapping image props render correctly up to 32 overlapping props. Does your file have a lot of reflective objects that reflect each other?
  5. You should post this wish in the Spotlight forum so Kevin sees it.
  6. This image was done by using a rough bump shader with very small bumps. I thought you had tried that but re-reading your post it says you tried an image bump shader. Try either the wrapped rough or wrapped leather shaders with very small scale values - maybe you can get a decent look that way. I have done this before but there are limits to what it can give you; the smearing effect might not be enough to simulate frosted glass or a scuffed veneer or something like that. RW includes the "Industrial Toolbox" procedural shaders mentioned. We do not support the LWA archive format. This is a continuing wish. HTH, Dave Donley dave@nemetschek.net
  7. Hello azizg: Yes, this bug should be gone as of 12.5. Dave Donley dave@nemetschek.net
  8. Hello Hans: 1. This will be difficult to do; the best way would probably be to use an image of a brushed material for the color and the anisotropic, brushed shader for the reflectivity. The blurred reflections can only be done by what you've already tried. We have requested this kind of blurred reflection from LightWorks; they would have to create a new kind of shader to do it. 2. Create a texture using Translucency for the reflectivity shader, and set it to not cast shadows. Apply it to your lamp shade object. Any lights behind the shade will light it up. This looks best with a Sharp distance falloff (like in the real world); if you use sharp falloff you will need to use an emitter value for the brightness or bump the brightness percent value up much higher, and/or use the auto exposure feature in the Custom RW options. HTH, Dave Donley dave@nemetschek.net
  9. If the IES file has brightness information in it then RW can use it for rendering the light, but the value is not displayed in the Object Info palette. If you're talking about the light meter feature in Spotlight, I am pretty sure it doesn't pick up the custom light object values. I think the value assigned to the instrument PIO drives that number. Dave Donley dave@nemetschek.net
  10. Hello Charity: To use IES files to define the light's distribution, you must create a Custom light object. The Custom light object is the one that requires an IES file to define it. It is the last kind of light in the mode bar for the Light Tool. HTH, Dave Donley dave@nemetschek.net
  11. Photomatix has an unwrap mirror ball facility, which works well and is free! http://www.hdrsoft.com/ It should be pretty easy to set up a 100% mirrored sphere, a RW camera with a 1:1 aspect pointing at the sphere with a narrow field of view (making the sphere pretty small can accomplish this), and a 2D unfilled square around it aligned with the camera view's perspective crop rect to make the export marquee snapping more accurate. Put these things in a reusable symbol that you could drop into any model, change to the camera's view, render in a RW mode, and export the HDRI. Unwrap the mirror ball to lat/long format in Photomatix, then re-import as a RW background for use in other projects. Dave Donley dave@nemetschek.net
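The mirror-ball unwrap that Photomatix performs can be sketched as a direct resampling: for each output lat/long pixel, find the point on the ball whose surface normal reflects that world direction back toward the camera, and copy that pixel. Below is a minimal NumPy sketch under stated assumptions — a square, orthographic-style render of a 100% mirrored sphere, nearest-neighbor sampling, and a front direction pointing back toward the camera; the function name and axis conventions are illustrative, not Photomatix's actual implementation:

```python
import numpy as np

def mirrorball_to_latlong(ball, out_h=512):
    """ball: square (H, H, C) float image of a mirrored sphere.
    Returns an (out_h, 2*out_h, C) lat/long (equirectangular) image."""
    size = ball.shape[0]
    out_w = 2 * out_h
    v, u = np.mgrid[0:out_h, 0:out_w]
    lon = (u / out_w - 0.5) * 2.0 * np.pi   # -pi..pi; 0 = toward the camera
    lat = (0.5 - v / out_h) * np.pi         # +pi/2 (up) .. -pi/2 (down)
    # World direction seen at each output pixel.
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = np.cos(lat) * np.cos(lon)
    # The sphere normal that reflects this direction back to the camera is
    # the half vector between the direction and the view axis (0, 0, 1);
    # its x/y components give the position on the ball image (unit disk).
    nz = dz + 1.0
    norm = np.sqrt(dx * dx + dy * dy + nz * nz)
    norm[norm == 0] = 1e-9                  # guard the degenerate back pixel
    nx, ny = dx / norm, dy / norm
    px = np.clip(((nx + 1.0) * 0.5 * (size - 1)).astype(int), 0, size - 1)
    py = np.clip(((1.0 - ny) * 0.5 * (size - 1)).astype(int), 0, size - 1)
    return ball[py, px]
```

Note the rim of the ball compresses the entire "behind the ball" hemisphere into very few pixels, which is why a single unwrapped ball image is blurry near the panorama seam.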
  12. Hello Gideon: You are correct about what HDRI files are useful for. Where the HDTV format adds more pixels to a TV image, HDR makes each pixel "deeper" in its color resolution. The main use of this added color depth is to capture the wide range of brightness that occurs in the real world. This is accomplished by capturing multiple exposures of a scene and stitching them together into a single "deep" HDRI file. You have to use a separate application like HDRShop or Photosphere to do this stitching. Note that you can use the HDRI file formats (HDR and OpenEXR) to store 2D, i.e. non-panoramic, images, but almost all HDRIs are used as lighting sources or backgrounds for 3D renderings right now. A panoramic HDRI can be used as a 3D background that surrounds the model (it's like putting a huge textured sphere around the model), for generating a set of directional lights at the pixel locations, or both. You can set the layer to use an HDRI for lighting and a plain color for the background, for example.
      1. LightWorks automatically detects the format from the locations of the black pixels in the imported HDRI file. If the black pixels form a circle around the image then it will be set to Angular. If there are black squares making a vertical cross then it is that type. Otherwise it is Lat/Long. These are all ways to map a 3D textured sphere to a 2D plane. Lat/long is probably the best because it uses all the pixels and the distortion is limited to the poles. Note that you cannot create a true Angular HDRI without some other package like HDRShop ($), because a rendering of a mirrored sphere is not the same as the angular format, even though they look similar.
      2. I think what MikeB is talking about is using an HDRI to handle the "sky" part of the lighting and a regular directional light to handle the "sun" part. This is a good and appropriate use of the feature. Any very bright spots in the HDR image will by nature become sun-like directional lights, but it is a common technique to separate these very bright sources out of the HDR image. Paul Debevec (the father of this stuff) used conventional light objects for several of his HDRI movies: the Parthenon movie (http://www.debevec.org/Parthenon/) used a regular directional light for the sun, and the FiatLux movie used conventional area lights in the stained-glass windows (http://www.debevec.org/FiatLux/).
      3. The imported HDRI images are assumed to be seen from a front view (0 degrees means front), so if you are exporting and then re-importing a VW-generated HDR image, set your view up that way when exporting.
      Hello Hans:
      1. If you want to export a rendered environment and use it as an HDRI, the easiest way is to create a 100% reflective sphere (the Mirror reflectivity shader with the Mirror factor at 100%) and a camera that aims at it with a narrow field of view and a square 1:1 aspect ratio, then render and export that as HDRI using the marquee option. This will produce a "mirrored ball" format HDRI image. However, LightWorks does not support that format directly yet. You could convert the mirrored ball into a true angular format using HDRShop, but that will cost $ as the free version of HDRShop doesn't read EXR format files. If you only want a single color, or if you do not care about the distortion that comes from using the wrong format (for example, if you are only using the HDRI for lighting and not for a background), then you will be fine telling RW your mirrored-ball HDRI is an Angular format image. If it is rendered as a background you will see some distortion, because it is not really an angular format.
      2. You can also create an image that is twice as wide as it is tall (2:1 aspect) and tell RW that it is a Lat/long format image.
      3. You are correct that there is excessive noise when using HDRIs for interiors. For now it is recommended that they be used for exteriors, or for partial interiors where the light is not trying to thread its way through narrow holes in the model. HTH, Dave Donley dave@nemetschek.net
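The black-pixel detection described above can be sketched as two signature checks on a luminance mask. This is a toy approximation of the heuristic as described, not LightWorks' actual code — the threshold, block sizes, and function name are all illustrative:

```python
import numpy as np

def detect_hdri_format(img, black=1e-4):
    """img: (H, W, 3) float array. Returns 'angular', 'cross', or 'latlong'."""
    h, w, _ = img.shape
    mask = img.max(axis=2) <= black              # True where a pixel is black
    # Angular: a square image whose four corner blocks are all black,
    # i.e. the lit pixels form an inscribed circle.
    if h == w:
        k = h // 8
        corners = (mask[:k, :k], mask[:k, -k:], mask[-k:, :k], mask[-k:, -k:])
        if all(c.all() for c in corners):
            return "angular"
    # Vertical cross: a 4-row x 3-column cell grid where every cell outside
    # the cross shape (the middle column plus the second row) is black.
    if 3 * h == 4 * w:
        ch, cw = h // 4, w // 3
        side = [(r, c) for r in range(4) for c in (0, 2) if r != 1]
        if all(mask[r*ch:(r+1)*ch, c*cw:(c+1)*cw].all() for r, c in side):
            return "cross"
    return "latlong"
```

Lat/long is the fall-through case because, as noted above, it uses every pixel and so has no black-pixel signature to detect.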
  13. Hello David: The Filtered Image color shader will let you tint an image (grayscale works best) with a color. You should be able to duplicate the existing texture and change the color shader to a filtered image shader and re-use the existing image as a start. If you choose the Obj Fill option you can use the textured object's fill color to tint the image. Dave Donley dave@nemetschek.net
  14. Hi MikeB: Yes, one of the examples is just a plain single colored sky, with no bright spots in it. Dave Donley dave@nemetschek.net
  15. Hello ErichR: Open a new file, go to View->Rendering->Custom RenderWorks Options. These are the same settings as FQRW. Dave Donley dave@nemetschek.net
  16. Hello MikeB: There is a simple sky dome HDRI resource in the "Default" RW backgrounds file, which was missed from the 12.5 distribution. You can download it here: http://www.nemetschek.net/downloads/renderworks/index.php
  17. Hello Bryn: You can render as a viewport and have control over the sheet layer DPI, without as many rerenders happening. The old skool way is to use the Render Bitmap tool, which has the same DPI feature. The DPI is relative to your page - so a 1" x 1" bitmap at 300 DPI will be 300 pixels wide by 300 pixels tall. Three to five hours sure sounds excessive even on a G4. There must be some geometry that is overly detailed - or are there a lot of reflecting glass surfaces? To tune geometry, one thing to try is to look at the model in wireframe and zoom in on the parts that are black. Sometimes geometry will be so detailed that the wireframe view of objects appears to have a solid black fill, because there are so many lines. Are there a lot of sweeps with very small increment angles? Is the geometry imported DWG meshes? BTW, on the Mac you can select and copy a rendered viewport (or rendered bitmap), then paste into the Preview application (File->New From Clipboard) and save it out as a file. If it has problems coming in, set the viewport fill to solid instead of no fill. HTH, Dave Donley dave@nemetschek.net
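The DPI rule above is plain arithmetic: the exported pixel size is the drawing's page dimensions times the DPI setting. A quick sanity check (the function is just for illustration):

```python
def export_pixels(width_in, height_in, dpi):
    """Pixel dimensions of a rendered bitmap/viewport at a given DPI."""
    return round(width_in * dpi), round(height_in * dpi)

# A 1" x 1" area at 300 DPI:
print(export_pixels(1, 1, 300))      # -> (300, 300)
# A full 8.5" x 11" sheet at 150 DPI:
print(export_pixels(8.5, 11, 150))   # -> (1275, 1650)
```

This is also why doubling the DPI quadruples the pixel count, and therefore roughly quadruples render time for a pixel-bound scene.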
  18. Hello Mark: Are your layer options set to Show Others? The layers will not render interactively unless the layer options are set to Active Only. You can also try changing the Responsiveness slider value in the VW Prefs dialog toward the Detailed side. Dave Donley dave@nemetschek.net
  19. Hello Cathal: There is not a 3D Studio Max plugin like the one for C4D. The best thing to use would be the 3DS export feature in VW. Slainte, Dave Donley dave@nemetschek.net
  20. Very nice quality textures too: http://www.psicosonic.net/?zona=00products&cat=02textureszona Dave Donley dave@nemetschek.net
  21. Hello domer1322: 1. There isn't a way to adjust the location of the HDRI sphere in z. 2. How big are the HDRI files? Larger HDRIs can run out of memory when rendered on Windows, less often on the Mac. We hope to alleviate this memory usage problem in a future release.
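Some back-of-envelope math behind the memory issue mentioned above: an uncompressed floating-point HDRI costs width x height x channels x 4 bytes once decoded, and a renderer may hold more than one copy. This is only a rough estimate, not RenderWorks' actual memory layout:

```python
def hdri_megabytes(width, height, channels=3, bytes_per_channel=4):
    """Approximate decoded size of a float HDRI, in MB."""
    return width * height * channels * bytes_per_channel / (1024 ** 2)

print(round(hdri_megabytes(2048, 1024)))   # -> 24   (a modest lat/long map)
print(round(hdri_megabytes(8192, 4096)))   # -> 384  (strains a 32-bit process)
```

At the larger sizes, a few working copies can approach the address-space limit of a 32-bit application, which is consistent with the failures showing up on Windows first.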
  22. Hello Hans: Are the 3D polygons non-planar? RW will not render non-planar polygons well. Do the 3D polygons overlap the other objects? That may be the reason for the artifacts - the renderer will render one object or the other in a random fashion, because it is not well defined which one is in front and which is behind. Usually this looks like a "comb" effect.
  23. Hi Mat: I think the lines are kind of confusing. The ceiling looks way cool! Maybe if there was something to reflect in the exterior surfaces that would add interest? I'm thinking about an HDRI environment separate from the lighting (you can control them independently through the Lighting Options dialog). HTH, Dave Donley dave@nemetschek.net
  24. Very Helpful, George. Please post any others that you come across!