Claes Lundstrom

Personal Information

  • Homepage
    www.touchcad.com, www.lundstromdesign.com
  1. Affinity still has some catching up to do in some areas. I've been using Photoshop AI quite a lot lately for generating backgrounds, and that is an area where Affinity needs to step up. The spelling checker also needs a major update for those of us not speaking one of the bigger languages. You can fix it, but it's primitive at best. I never liked Adobe Illustrator much, and therefore used Freehand quite a lot back in the late eighties and early nineties, when the rendering apps of the time were pretty much unusable. Now that Affinity Designer has DXF/DWG in/out that works much better than the notoriously unreliable equivalent in Illustrator, and importing PDF from VW works well, I have moved 100% to Affinity Designer. Being able to export SVG is also useful, since VW does not do this for some reason. All in all, I really like the Affinity line, and use it quite a lot as a supporting unit in my workflow. The apps have evolved a lot in a short period of time, so most likely they will have an impact as they close in on Adobe.
  2. I totally agree with Peter here: backgrounds are absolutely essential for most renderings, and especially for water. Another important thing is to consider the direction of the light. Light behind the viewer tends to make water look gray and plain, whereas the opposite makes it come alive, making it shine and sparkle. The opposite of course applies to the object you visualize, so the trick is to find the sweet spot between those factors. Here are a couple of examples showing the impact of the background, where it very much sets the mood for the entire picture, on some more open-water renderings.
  3. A good water texture is very shiny and reflective, has a bump map for waves, and that's it. It really doesn't have a color of its own, as the color is almost always 100% reflections of the surroundings. I generated these three images in Photoshop AI as examples, and if you study the colors of the water surface, you will see that they all consist of reflections. The reflections can be more or less blurred due to wind and waves, but they are nevertheless reflections. The only exception is when the water is very clear, so that you can see the bottom.
  4. The problem is in the receiving app, not in Vectorworks, and the same goes for the other apps I tried. If I import a file into Apple's USDZ viewers, it comes in at scale, unless it's too big to fit into a technical bounding box that many of these apps have. If I import a, say, 50 meter object, the app will scale down or crop the model. If you import a typical car, it will work. On the Adobe Aero side, it does not import at scale. You have to wing it, and if the model goes beyond the box, Aero will start to crop it.
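The bounding-box behavior described above can be sketched as follows. This is a guess at the logic, not the actual code of any of these viewers, and the 10 m box limit is a made-up number for illustration:

```python
def fit_model(model_size_m, box_limit_m, crops=False):
    """If a model exceeds the viewer's bounding box, either scale it down or crop it."""
    largest = max(model_size_m)
    if largest <= box_limit_m:
        return model_size_m, 1.0  # fits: imported at true scale
    if crops:
        # Aero-style behavior: clamp each dimension to the box, scale untouched
        return tuple(min(d, box_limit_m) for d in model_size_m), 1.0
    # scale down uniformly so the largest dimension just fits the box
    s = box_limit_m / largest
    return tuple(d * s for d in model_size_m), s

# A typical car (~5 m) fits a hypothetical 10 m box; a 50 m hull does not.
car, s1 = fit_model((5.0, 2.0, 1.5), 10.0)    # unchanged, scale factor 1.0
hull, s2 = fit_model((50.0, 8.0, 5.0), 10.0)  # uniformly scaled to 1/5 size
```

Either way, anything over the limit no longer matches the drawing units you set, which is why large objects need a manual setup.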
  5. It's based on a 1968 Riva Aquarama, specifically Lamborghini founder Ferruccio Lamborghini's private Riva, powered by twin Lamborghini V12 engines. The textures are UV-mapped from a series of pictures of the real boat. Here is a link for those interested. For product renderings of boats, you typically use comparatively flat water, as too much drama in the surroundings tends to distract more than it adds to the picture. I guess the message is that it's something pleasant, not something dangerous.
  6. On the subject of AI, I played around a bit with AI in Photoshop. Generating full images gives very mixed results: some OK, some terrible. It does better when adding elements to a picture, and when extending a picture beyond its borders. In this example, I started with the rendering on the left, added some motion waves around the boat and a coastline far behind. In the right image, I extended the picture on all sides to get more of an overview. The waves took 5-10 attempts to get the effect I wanted (Photoshop generates suggestions in batches of three). The coastline was OK within one try. The same applied to the extensions, though Photoshop seems to prefer extending in one direction at a time.
  7. This is really basic mapping that should just work, if you ask me. I work with UV-mapping on a daily basis. It's an absolute must to have something that is easy to use and delivers perfect, high-quality results every time, with a minimum of fuss. Perhaps the developer team needs to sit down and do a proper update of these features.
  8. Which app do you use? On a basic level, Apple's USDZ viewers on the iPhone/iPad and in Mac Preview get the scale right every time, provided you set the drawing units correctly, so it can't be that it doesn't work in principle. Adobe Aero does not; you have to set it up manually. So there must be something else messing it up.
  9. Yes, the bottom picture looks reasonably right, though it was a bit too low-res, so I pumped it up with AI. Globe.vwx
  10. Something like this? Note that water is highly reflective and does not have a color of its own. The color comes from the surroundings: the sky, clouds, land in the background, objects in the water, the depth of the water, and from the water itself if it's polluted by something. The main features of a water texture are that it is very shiny plus has some degree of bump map, but no texture picture like, for example, wood or stone. The bump map can probably be reduced in size and file format (sorry, but I usually need very high quality texture maps in my work). Water 2009 v2024.vwx
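The water recipe above can be written down as a generic shader parameter set. The parameter names below are illustrative PBR-style conventions, not a real renderer or Vectorworks API, and the numbers are plausible starting values rather than anything from the attached file:

```python
# Minimal sketch of the water "material" described above: nearly mirror-like,
# no color or image texture of its own, with a bump map carrying the waves.
water_material = {
    "base_color": (0.0, 0.0, 0.0),  # no own color; reflections supply all color
    "reflectivity": 0.95,           # very shiny, close to a mirror
    "roughness": 0.05,              # low roughness = sharp reflections
    "transparency": 0.0,            # raise only for very clear, shallow water
    "bump_map": "waves_bump.png",   # hypothetical bump map file for the wave relief
    "texture_map": None,            # no image texture, unlike wood or stone
}

def blur_for_wind(material, wind_strength):
    """Windier water blurs the reflections: raise roughness with wind (0..1)."""
    m = dict(material)
    m["roughness"] = min(1.0, 0.05 + 0.4 * wind_strength)
    return m

calm = blur_for_wind(water_material, 0.0)    # sharp, mirror-like reflections
choppy = blur_for_wind(water_material, 1.0)  # softer, wind-blurred reflections
```

The key point the posts above make survives in the parameters: `base_color` and `texture_map` contribute nothing, so everything you see comes from the surroundings.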
  11. Here is a small test on a similar room size. As you say, you can't order a prefabricated kitchen based on these scans, but I have found them useful in many ways, for making quick estimates of floor and wall areas etc., and you get a lot of details you risk missing by just measuring manually and using photos. The error margin in this case is just over 1% compared to measurements with a Leica laser. I used the trick of scanning just the roof, as lots of furniture, for example, tends to mess things up, and of course hoping that the walls are fairly vertical. I also used the clip cube to trim off the edges in the corners, where scanners tend to create a "worn soap" effect (rounding off the corners), which makes it difficult to extract the true wall surface. I guess a few such basic measurements can give you a calibration factor that you can apply to the dimensions used.
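The calibration idea in the last sentence is simple arithmetic: divide a trusted laser dimension by the corresponding scanned dimension, then multiply the other scanned dimensions by that factor. A minimal sketch, with made-up numbers at roughly the 1% error mentioned above:

```python
def calibration_factor(laser_dim, scan_dim):
    """Ratio of a trusted laser measurement to the same dimension in the scan."""
    return laser_dim / scan_dim

def calibrate(scan_dims, factor):
    """Apply the factor to other dimensions taken from the scan."""
    return [d * factor for d in scan_dims]

# Illustrative only: a wall the laser says is 4.00 m reads 3.96 m in the scan.
factor = calibration_factor(4.00, 3.96)
corrected = calibrate([3.96, 2.475, 5.12], factor)
# The reference wall comes back to 4.00 m; the rest scale by the same ~1%.
```

Averaging the factor over a few reference walls would smooth out scanner noise in any single measurement.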
  12. Have you tried turning off simplification in Scaniverse? It may generate more points. Not sure more points necessarily means a better result, though. Simplify can also remove irrelevant stuff and make the files smaller. One limitation with lidar is that it only sees up to 5 meters, so photogrammetry has an edge in areas where you can't physically get within range of a given point. I have tried a few other apps that generate more points, but I'm not sure they added much in quality. Generally speaking, I wish there was an efficient method to extract edges, as a few edge points are often a better base for modeling. This applies to most of these apps regardless of method.
  13. Scaniverse is a free app to start with if you want to experiment. It exports several formats that can be used with VW, both for mixing scans with CAD elements and for processing points with the terrain modeller. It also exports USDZ models that can be viewed in Apple's Preview app, as well as in augmented reality on the iPhone or iPad with no installation needed. Expect an accuracy of about 0.5%. The example is about 180 sq m and took about ten minutes to scan in one session.
  14. Why is it a problem? I get correctly scaled models using the free Scaniverse, both as OBJ models and as point clouds, with an accuracy of about 0.5%, so why would it not work with a subscription app?