Procedural Generations - 3 Display Audio Response Test

November 4, 2014 Joe McCraw

Demonstration Machine Specs: MacBook Pro (Retina, 13-inch, Mid 2014), 2.6 GHz Intel Core i5, 8 GB 1600 MHz DDR3, Intel Iris 1536 MB.

This is the first test of incorporating audio-responsive elements (in this case via the mic input) into the Procedural Generations native OS X application product line. As the audio input volume fluctuates, our software responds visually, here by modulating certain aspects of our motion graphics in compelling ways. We can also isolate low, mid, and high frequency bands separately to create visual compositions that reflect the acoustic dynamics of a live audio input.
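To make the idea concrete, here is a minimal Swift sketch of the kind of mic-input analysis described above. It is an illustration only, not our shipping code: the buffer size, smoothing factor, and scaling are placeholder values. It taps the microphone with AVAudioEngine, measures each buffer's RMS with Accelerate, and smooths that into a level a render loop could read to drive motion-graphics parameters:

    import AVFoundation
    import Accelerate

    // Hypothetical sketch, not the shipping Procedural Generations code:
    // tap the mic with AVAudioEngine and smooth each buffer's RMS into a
    // 0...1 "level" that a render loop can read to modulate graphics.
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    var smoothedLevel: Float = 0   // read by the render loop

    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let samples = buffer.floatChannelData?[0] else { return }

        // RMS of the buffer is a rough measure of input volume.
        var rms: Float = 0
        vDSP_rmsqv(samples, 1, &rms, vDSP_Length(buffer.frameLength))

        // Smooth so the visuals don't flicker; a low/mid/high band split
        // would run an FFT (vDSP) on these same samples.
        smoothedLevel = 0.8 * smoothedLevel + 0.2 * min(rms * 20, 1)
    }

    do { try engine.start() } catch { print("Audio engine failed: \(error)") }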

In this case, all three displays are driven by a single i5 Retina MacBook Pro running three separate instances of the Procedural Generations software: two external 1080p displays (one via HDMI, one via Thunderbolt-to-HDMI) plus the internal 2560x1600 Retina panel, a staggering 8.2 megapixels (8,243,200 pixels) per frame at 60 FPS!
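For the curious, the pixel math works out like so (a quick sanity check, nothing product-specific):

    // Two external 1080p panels plus the internal 2560x1600 Retina display:
    let externals = 2 * (1920 * 1080)   // 4,147,200 px
    let retina    = 2560 * 1600         // 4,096,000 px
    print(externals + retina)           // 8,243,200 px per frame, redrawn 60x/sec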

Live audio visualizers have a wide variety of live-event applications. We will develop efficient, native OS X motion graphics applications tailored to your specific live event.

For example, we can work with your talent to create an interface with their:

  • drum machines (via MIDI and TouchOSC)
  • lighting commands (via DMX or ArtNet)
  • network commands over Ethernet (non-blocking/event-driven); see the sketch after this list.
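To give a flavor of the network option above, here is a hypothetical Swift sketch of an event-driven UDP listener built on Grand Central Dispatch. The port number and packet handling are placeholders, but it shows how control commands (from a TouchOSC layout or a custom surface) can arrive over Ethernet without blocking the render loop:

    import Foundation
    import Dispatch

    // Hypothetical sketch (port and parsing are placeholders, not production
    // code): an event-driven UDP listener for incoming control commands.
    let port: UInt16 = 9000
    let fd = socket(AF_INET, SOCK_DGRAM, 0)

    var addr = sockaddr_in()
    addr.sin_family = sa_family_t(AF_INET)
    addr.sin_port = port.bigEndian
    addr.sin_addr = in_addr(s_addr: INADDR_ANY)
    withUnsafePointer(to: &addr) {
        $0.withMemoryRebound(to: sockaddr.self, capacity: 1) {
            _ = bind(fd, $0, socklen_t(MemoryLayout<sockaddr_in>.size))
        }
    }

    // GCD only wakes this handler when a datagram arrives (event driven).
    let source = DispatchSource.makeReadSource(fileDescriptor: fd, queue: .main)
    source.setEventHandler {
        var buffer = [UInt8](repeating: 0, count: 1024)
        let n = recv(fd, &buffer, buffer.count, 0)
        if n > 0 {
            // Parse the OSC / control packet and update graphics state here.
            print("received \(n) bytes of control data")
        }
    }
    source.resume()
    dispatchMain()   // keep this standalone sketch alive

Because the dispatch source only fires its handler when data is waiting, the graphics thread never polls or blocks on the socket.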

Future improvements (currently in development) include audio input routing (improving latency, sensitivity, and accuracy) and integration with MIDI and TouchOSC events.

Let us know what you think!

Tags: procedural generation, high resolution motion graphics, motion graphics, ShowBlender