Before we dive into the many ways to record Jitter output, let's take a step back and first identify what we're trying to record. In most cases the answer is: "exactly what I see in my window." OK, sure – but we need to locate exactly which object or patch-cord in our patch is drawing to the window. If the content is a chain of textures or matrices, it's simply a matter of taking the output from the last object in the chain. However, if the content is OpenGL geometry objects (jit.gl.mesh, jit.gl.gridshape, etc.), or a scene post-processed with jit.gl.pass, it's not always easy to identify. Fortunately the latter case has an easy solution: jit.world. If your patch's render-context and window are handled by the jit.world object (which they should be), then you simply need to enable matrix or texture output, depending on which recording technique you are using (more on that below). This will send the window content out jit.world's first outlet as a matrix or texture. Now might be a good time to brush up on the difference between matrices and textures, texture output, or even what a texture is.

Over the years I've used the same setup as (or a similar setup to) my colleagues, either recording directly in Jitter or by using Syphon. That was until recently, when I got hold of an Atomos Ninja V. The Atomos Ninja V is widely used in the film industry for recording off-camera b-roll and the like, but how does it work with Jitter? You can think of it basically as an HDMI recorder (it records to an attached SSD). It has an HDMI input and an HDMI output (a direct loop-through from the input) and works just like an external monitor for your computer (albeit a bit smaller): simply take the HDMI output of your machine and connect it via an HDMI cable to the Atomos Ninja V. It's that easy. When your Jitter patch is ready to record, drag the Jitter window displaying your patch onto the Atomos "monitor", set it to full screen, hit record on the Atomos, and you're away. You'll find that because the graphics are off the machine you're working on, you can record at significantly higher frame rates and resolutions, and free up a lot of headroom for other Max work happening concurrently. It's not the most cost-effective method if you purchase one, but it does provide consistent results at showreel quality. A nice alternative is that you can rent these via Kitsplit or similar for as little as $10 per day (24 hrs), so if you have a Jitter piece you need to record at high resolution or frame rate, renting an HDMI recorder might be a good option.

A lot of my work relies on feedback and live input, making a frame-by-frame capture setup a non-starter. Similarly, the CPU expense of recording in realtime is something I'd rather spend on extra vertices or layers of processing. Probably my favorite way to go is recording directly to an Atomos Ninja or Blade, or a BlackMagic HyperDeck Studio Mini. These outboard recorders can handle pretty much any resolution or framerate you throw at them, and don't cost you any more processing than outputting to a second monitor or projector. They even offer loop-through, so you don't have to use an extra port. An added benefit is that they support multi-channel audio capture (depending on the model) and come in HDMI and HD-SDI flavors. For live performances, all I do is pass my video through the capture box and send a submix or program out from the sound board, and I get a frame-accurate recording of the sound and image, exactly as it appears to the audience. The obvious downside is cost, as most of them cost more than your Max license. Another option is to use a second computer with a capture card like a BlackMagic Intensity. You can then capture into Max or directly into your favorite video editor.

The included patch is intended as a proof-of-concept, and contains 3 modules (RECORDER, PARAMETER-RECORDER, and PARAMETER-PLAYBACK) that can be dropped into an existing patch. Jit.matrixset is used to record parameter data, and pattrstorage is used to expose parameters for recording. Only single-value parameters are supported in the current state. The patch content is simply an animated jit.gl.bfg. The process for rendering a 4k animated scene is as follows:

1. Open the PATCH-AND-PARAMETERS sub-patcher to expose the controls.
2. Play around and get a feel for the patch and what you want to record.
3. When ready to record an animation, enable the toggle on the PARAMETER-RECORDER sub-patcher.
4. Animate the parameters, then disable the toggle to stop the recorder.

The patch defaults to a maximum of 5 minutes of recording at 30 FPS.
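The record-then-replay idea behind the PARAMETER-RECORDER and PARAMETER-PLAYBACK modules can be sketched outside of Max. This is not the patch's actual implementation (which uses pattrstorage to expose parameters and jit.matrixset to store them); it's a minimal, hypothetical Python illustration of capturing single-value parameters once per frame and replaying them deterministically later:

```python
# Hypothetical sketch of frame-synced parameter recording/playback.
# In the real patch this role is played by pattrstorage (exposing
# parameters) and jit.matrixset (storing one value per frame).

FPS = 30
MAX_SECONDS = 5 * 60            # the patch caps recording at 5 minutes
MAX_FRAMES = FPS * MAX_SECONDS  # 9000 frames at 30 FPS

class ParameterRecorder:
    def __init__(self):
        self.frames = []        # one snapshot of all parameters per frame
        self.recording = False  # the "toggle" in the sub-patcher

    def record_frame(self, params):
        """Store a snapshot of single-value parameters for this frame."""
        if self.recording and len(self.frames) < MAX_FRAMES:
            self.frames.append(dict(params))

class ParameterPlayback:
    def __init__(self, frames):
        self.frames = frames

    def params_at(self, frame_index):
        """Return the recorded parameter values for a given frame."""
        return self.frames[frame_index]

# Usage: record a 'rotate' parameter for 3 frames, then replay it.
rec = ParameterRecorder()
rec.recording = True
for frame in range(3):
    rec.record_frame({"rotate": frame * 1.5})
rec.recording = False

play = ParameterPlayback(rec.frames)
print(play.params_at(2))  # {'rotate': 3.0}
```

Because playback is indexed by frame rather than by wall-clock time, the same parameter curves can be replayed during a slow, high-resolution render and still match what was performed live.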
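What makes frame-by-frame rendering attractive for a 4k scene is that each frame can take as long as it needs: instead of advancing the animation by wall-clock time, you advance it by a fixed timestep per rendered frame. A hedged sketch of that timing logic (the `render_offline` function is illustrative, not part of the Jitter API):

```python
# Offline, frame-by-frame timing: advance the scene clock by a fixed
# step per frame, regardless of how long each 4k frame takes to render.
FPS = 30
FRAME_STEP = 1.0 / FPS

def render_offline(duration_seconds, render_frame):
    """Call render_frame(t) once per frame at a fixed timestep."""
    total_frames = int(duration_seconds * FPS)
    for n in range(total_frames):
        t = n * FRAME_STEP  # scene time, decoupled from wall-clock time
        render_frame(t)
    return total_frames

# Usage: a 2-second animation yields exactly 60 frames at 30 FPS,
# whether each frame renders in 1 ms or 10 seconds.
times = []
count = render_offline(2.0, times.append)
print(count)  # 60
```

This is also why the approach is a non-starter for work that depends on live input or feedback: there is no "live" signal to sample when the scene clock is decoupled from real time.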