Current projects, works in progress

  • Poser Ambassadors

    Styling dynamic hair with magnets. (and some minor face morphs)
    Colors come from a ColorRamp node in PhysicalSurface as shown some time ago.
    SuperFly 20 samples.

  • @seachnasaigh I want to build a Tron suit for Dusk and maybe one for Dawn. My question, though, is about the lights on the suit. Can I get that effect with just a material zone, or does it have to be a modeled detail in the suit? I'm thinking a material zone with an emitter node attached?

  • Poser Ambassadors

    @eclark1849 In Superfly, you could do it even if it were all one material zone, using a discriminator mask. Firefly generally needs an actual emitter, though, so a material-zone-only approach may not work very well in Firefly.

    If you assign the glowing polys as a separate material zone, the nodework will be simpler. Easily done for Superfly.

    If you want similar results in Firefly, I'd recommend making an emitter mesh which would duplicate the glow polys but be raised a bit from the suit surface. This emitter would need to be a separate (dynamic?) prop or be a separate figure part, or be a separate conformed figure. That's because you need to be able to set the emitter's properties to be not visible in camera and not cast shadows.

    The old Tron had quite saturated red or blue glowing trim; your screenshot's glow strips look white. Do the glow strips change color?

    In any case, I'd be glad to help with that.

    Server rack project status: rack is completely assembled and cables all connected, but I'm still working on the household 120V AC power upgrades.
    So, rack -and workstations- are ready, but no power available yet. The only PC I have working is little Urania, at the kitchen diner booth.

  • @seachnasaigh said in Current projects, works in progress:

    During the winter, just shut off the furnace and use the servers' heat.

    All that heat generation would probably require a bucket full of electricity.
    Have you figured out what it would cost to run the rack for a day (24 hrs)?

  • Poser Ambassadors

    @krios said in Current projects, works in progress:

    All that heat generation would probably require a bucket full of electricity.
    Have you figured out what it would cost to run the rack for a day (24 hrs)?

    I haven't measured electrical cost over time, but the last time that I was running atmosphere renders for five days on the previous rack, I recall that the electric bill was about $150 higher than normal. So, that would imply about $30/day when rendering nonstop.
    Before anyone clutches their pearls and gasps, I'll point out that [1] running the rack for that long is only an occasional thing, and [2] it would cost you more to do the same amount of rendering, because the blades are more efficient than a workstation or laptop. It would just take you much longer to get that done.
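    The arithmetic above can be sketched quickly. The $0.12/kWh rate in the second step is an assumption for illustration only; the post does not state the actual electric rate:

```python
# Back-of-envelope check of the render-run cost quoted above.
extra_bill = 150.0   # USD above the normal bill for the five-day run
days = 5
cost_per_day = extra_bill / days
print(f"~${cost_per_day:.0f}/day when rendering nonstop")

# With an ASSUMED residential rate of $0.12/kWh (not stated in the post),
# that daily cost would imply roughly this much energy use per day:
rate_per_kwh = 0.12
print(f"~{cost_per_day / rate_per_kwh:.0f} kWh/day at the assumed rate")
```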

    I do know how much electricity (in Amperes and Watts) each type of computer uses, measured with a line splitter ammeter.
    [Photo: line splitter ammeter used to measure PC current draw]
    This information may be of use to @momogun for planning purposes:

    • Dell C1100 blade w/ 2x X5650: 2.25A @ 125V AC => 270W

    • Dell r610 blade w/ 2x X5690: 3.35A @ 125V AC => 400W

    • HP xw6600 w/ 2x E5430 (old Harpertown simple {non-H/T} quad Xeons) and two Quadro GPUs: 2.80A @ 125V AC => 336W

    • Alienware Aurora w/ 2x X5690, liquid cooled, Quadro FX5800 GPU, 2x 30" 2560x1600 monitors: 6.20A @ 125V AC => 744W
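    A sketch of the amps-to-watts conversion behind the list above. Note that the listed wattages match amps × 120 V (US nominal line voltage, within rounding) rather than the 125 V quoted, so 120 V is assumed here:

```python
# Convert the ammeter readings above to watts and daily energy use.
# The posted wattages match amps x 120 V (within rounding), so 120 V
# nominal is assumed rather than the 125 V quoted in the post.
VOLTS = 120
readings_amps = {
    "Dell C1100 (2x X5650)":       2.25,
    "Dell r610 (2x X5690)":        3.35,
    "HP xw6600 (2x E5430)":        2.80,
    "Alienware Aurora (2x X5690)": 6.20,
}
for name, amps in readings_amps.items():
    watts = amps * VOLTS
    kwh_per_day = watts * 24 / 1000   # energy if run flat-out for 24 hours
    print(f"{name}: {watts:.0f} W, {kwh_per_day:.1f} kWh/day")
```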

    I do have plans to run the rack shortly after I get the household electrical service upgraded. I have a list of spiffy -but tough to render- atmospheres (spherical panorama sky renders) I intend to render in Vue Infinite 2014 via the network renderer HyperVue. I already know that some of those will take days to complete.
    These will then be released for use on the P11 construct, Snarly's dome, or my Lothlorien enviro's skydome.

  • Poser Ambassadors

    These electrical consumption measurements were taken at 100% CPU load, with each machine's internal cooling fans running fast.

    Note that the r610 server blade and the Alienware workstation have the same processor package (two X5690 Xeons). The blade uses less electricity because it isn't driving a video card, big monitors, etc., as the workstation is.

  • @seachnasaigh
    You can get accurate measurements with that line splitter ammeter, or use this to get a ballpark figure:

  • Poser Ambassadors

    @momogun For the shared monitor for server blades, I recommend a 19" 1280x1024 pixel monitor with a D-sub (VGA) port. The blades' onboard video chip is minimal, and 1280x1024 is the maximum resolution that they will display.

    Matching monitor native pixel resolution to the video chip's resolution gives the best clarity, so, a 1280x1024 monitor for a 1280x1024 chip. Most monitors of this resolution are 17" or 19".

    The other resolution options which both the C1100 and r610 blades offer are 800x600 and 1024x768. The r610 also offers 1152x864.

  • Poser Ambassadors

    I finally have new 20A AC power receptacles/circuits for each workstation and for each quartet of server blades. The next step will be to power on all of the machines, then run a full power render test.
    Well, all except workstation Cameron, as she has either a bad power supply or a motherboard fault. So, that's another bit of work to do.

  • @seachnasaigh
    Would it be possible to "cluster" your blades so that it would act as one giant CPU? Or for that matter, cluster any number of computers or laptops to act as one computer? Would Poser or the QM work with a system like that?

  • Poser Ambassadors

    @krios I think there may be a way to do that, by setting up a virtual machine, but I've never tried it. @shvrdavid knows more than I do about it.

    I use my network for Poser, Poser>Reality>Lux, and Vue. Lux and HyperVue are both capable of spreading a tough render across the network; my top wishlist item for Poser is to be able to network out a Superfly (or Firefly) render.

    I've powered up the entire rack (now fed from those new 20Amp electrical outlets), and I'm updating Windows, security software, etc. Everything looks good so far, on the Pixie Hollow network. :D

  • @seachnasaigh
    Thanks mate, and congrats on a successful boot-up! (keep an eye on the electrical bill)

  • @seachnasaigh Cops gonna bust down your door thinking you have a "grow" farm instead of a render farm with all the electricity you're using. LOL

  • Poser Ambassadors

    @ghostship "Nobody in here but us pixies!"

  • @seachnasaigh living the "Virtual Life". Retired to the Render Farm ;-)

  • @seachnasaigh thanks for this info. Now I know the Dell C1100 uses less electricity than the others.

  • @seachnasaigh Oh, I see. How about the Dell r510? Do you think its resolution is the same as the r610's?

  • @krios ha ha

  • @seachnasaigh Why don't you share some pictures? Please!

  • Poser Ambassadors

    @momogun The C1100 blades use less electricity than the r610 blades only because the r610 blades have faster processors (2.66GHz X5650 in the C1100, compared to 3.46GHz X5690 in the r610). Both processors are H/T hex-core, so same number of render threads, but the X5690 will be faster.
    Note: Not all C1100s will have dual X5650, and not all r610s will have dual X5690. Some may only have one CPU, some may have a CPU of fewer cores or slower speed. So, be sure to verify the model of processor and that it has two.
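    As a rough comparison of the two CPU options above, here is a crude throughput proxy (total cores × clock speed). Both are hex-core H/T Xeons of the same generation, so this ignores everything but the clock difference; treat it as a ballpark estimate, not a benchmark:

```python
# Crude relative render-speed estimate: total physical cores x base clock.
# Same core count and generation, so only the clock speeds differ here.
cpus = {
    "C1100 (2x X5650)": (2 * 6, 2.66),   # (total cores, GHz)
    "r610 (2x X5690)":  (2 * 6, 3.46),
}
base_cores, base_ghz = cpus["C1100 (2x X5650)"]
for name, (cores, ghz) in cpus.items():
    score = cores * ghz
    ratio = score / (base_cores * base_ghz)
    print(f"{name}: {score:.1f} GHz-cores, {ratio:.2f}x the C1100")
```

    By this measure the r610's X5690s should render roughly 1.3x as fast as the C1100's X5650s, which lines up with the clock-speed difference alone.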

    The r510 is newer than the C1100, and most I see on eBay have the 2.66GHz X5650, so I would expect that the r510 will render as fast as my C1100s, but the newer r510 will probably be a little more electrically efficient (using a little less electricity).

    The r510 has an optical disc drive, which means you can use a Windows CD/DVD install disc. For the C1100s, I reformatted a flash drive (at least 4GB capacity) to NTFS, which makes it a bootable device, and copied a Win7Pro install disc onto the NTFS flash drive.

    The r510 is 2U, meaning that each one takes up two slots in a rack. So if you planned on eventually accumulating eight r510 blades, you'd want to get a rack which is at least 16U tall.
    It appears that the r510 takes 3.5" hard drives. The C1100 uses 3.5" HD also, but only has four bays. The r610 uses laptop hard drives.
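    The rack-space arithmetic above, as a one-liner (the eight-blade count is just the example from the post):

```python
# A 2U chassis occupies two rack units; size the rack accordingly.
blades = 8            # hypothetical eventual number of r510s, per the post
units_per_blade = 2   # the r510 is a 2U chassis
print(f"{blades} x 2U blades need at least a {blades * units_per_blade}U rack")  # 16U
```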

    I would expect the r510 to have the same screen resolution as the C1100 and r610; figure on 1280x1024 pixels.

    The prices for the r510 with two X5650 Xeons ("2x 2.66GHz twelve core") and a generous amount of RAM are quite reasonable. :D