Showing posts with label avid. Show all posts

Thursday, February 11, 2016

Shooting & editing HDR via Avid using CLog gamma

We've got BVE2016 coming up and one of the things Root6 will be showing is an HDR workflow via Media Composer using Canon monitors.
HDR is still a bit of a crap-shoot as far as standardisation is concerned, with the BBC/NHK system, Dolby Vision, Sony's SLog3 and Canon's camera-native CLog all in play. The principle of using an alternate gamma so that you concentrate the bit-depth where you want the extra range is well established.


The hope is that all of these manufacturers will coalesce around ST.2084 which (amongst other things) defines how you handle the specular highlights; those very bright parts of the picture which give a real addition to the look of the pictures. These are typically defined to be >500 cd/m2, which is MUCH brighter than broadcast white! The idea is that the last bit of dynamic range (the 10th bit - all code values above 512) represents the highlights and everything up to 50% is akin to the usual video dynamic range. You calibrate the monitor such that 50% sits at 100 cd/m2 and just hope that the colourimetry of the highlights tracks RGB-wise!
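For the curious, the ST.2084 (PQ) transfer function is easy to play with; this little Python sketch (using the published constants from the standard) maps a normalised code value to absolute luminance in cd/m2. Note how half-scale lands at roughly 92 cd/m2, which squares nicely with calibrating 50% to sit around broadcast white:

```python
# ST.2084 (PQ) EOTF: maps a normalised code value (0..1) to
# absolute luminance in cd/m^2 (nits). Constants from the standard.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(n: float) -> float:
    """Luminance in cd/m^2 for a normalised PQ code value n in [0, 1]."""
    p = n ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.5))   # ~92 cd/m^2 -- half-scale sits near broadcast white
print(pq_eotf(1.0))   # 10000 cd/m^2 -- the absolute PQ peak
```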

So - Root6's own Dave Skeggs and I set off around Soho and London Bridge to capture some night-time and daytime footage. We were using a Canon C300 mk.2 set to UHD (3840 x 2160) at 25P (no interlaced fields at UHD, and unfortunately no high framerates at that resolution). We set the colour space to an optimistic Rec.2020 and the gamma to CLog. In that mode the camera shoots 410 Mbit/s XAVC-codec MXF files.
We've been using Media Composer v8.5 on an HP Z840 workstation & the new Avid/Blackmagic DNxIO video hardware; we had to update the firmware to get it to generate quad-link SDI. Although HDMI works, it is nobbled down to eight bits and so would not be suitable for this test. I would put up a link to the video but none of the video-sharing sites support HDR - and neither does the screen of your tablet/laptop/TV! I took all the monitor photos with my Fuji bridge-camera in a very bright office; you'll have to take my word for it!

Notice the headlights of the taxi - you can see details inside the light!


Exactly the same frame; notice the dark details in the trees against the night sky.

Of course, on Media Composer's GUI display you get the CLog gamma rendered as if it were Rec.709, so it looks very washed out and lacking in contrast.


You can have Avid flatten the gamma of source clips so that they look OK on the GUI - that doesn't affect sequences that the clip has been used in.

 
Quite a large range of alternate gammas and colour spaces

It shows up in the bin-view, which is useful

 
So now, clicking the source window and setting the monitor to regular HD gamma (BT.1886, fact fans) shows you what the same material shot on a "regular" camera would look like; very little detail in the blacks and none in the whites.
 
 Root6's own DOP; Dave "is that in focus?" Skeggs

I'd forgotten how limited a normal video-camera's dynamic range was. The Canon monitors top out the specular highlights at 400 cd/m2, which is somewhat less than a Sony BVM-X300 (1,000 cd/m2!), but for €10k less than the Sony (and losing only a stop-and-a-half of specular highlights) the Canon 30" UHD/4k IPS panel represents superb value. I was a bit disappointed that the camera tops out at 29.97P at >2k resolutions, so I couldn't see how nice fluid video motion looks at high res; everything has a jerky film-look to it.
Steve Shaw at Light Illusion has a very good article exploring some of the fundamentals of HDR.

Friday, March 20, 2015

Reliable Avid ethernet traffic over old fibres

The first Avid Unity fibre-channel SAN I installed was in 1999, and fibre was the standard for high-speed shared storage for editing for more than a decade. However, since the introduction of MPEG4-based editing codecs which allowed Avid to offer high-quality DNxHD we've been all about ethernet-attached shared storage; in Avid's case the ISIS storage products.

We have had a lot of customers who went to the expense of running OM3 (50-micron multi-mode fibre) cable only a few years ago and are now cheesed off that they need to flood their facility with Cat6 or 7 because the rough old Cat5e they have doesn't work reliably for gigabit. So, the obvious choice is to use that fibre - which has bags of bandwidth - for ethernet, but until very recently that was not a configuration approved by Avid. Recently they have tested Allied Telesis media converters and given a cautious thumbs-up. BUT they are very expensive (a few hundred quid per workstation, so you may as well run new cable), so I had a quick chat with the guys at Comtec about the house-brand converters they supply, which use the same chipset.

Neal Kemsley kindly ran Avid's PathDiag tool before and after and these little cheapies seem entirely transparent. Here's what he fed back to me;

"Windows ISIS client attached directly to a Dell N3048 switch running PathDiag in an "Unlimited" Writes-then-Reads test cycle targeting an ISIS Workspace (the term used by Avid for their storage volume – the same as Unity). Unlimited means that the test attempts to saturate the channel between the ISIS client and the storage, and is a good indicator of the upper limits of the potential connection between the client and the storage. In this case it is a 1 Gbit copper connection to the switch, and an optical 10 Gbit connection between the switch and three ISIS 5500 storage chassis. As you can see we are getting a result north of 100 MBytes per second – yes, that is megabytes – not bad for a 1 Gbit path! (Not too shabby, as they say in Boston!)"

"This one shows the same test parameters but with the client attached to the switch using the media converter pair in circuit. Note that the test results are really similar, perhaps with some minor variation in the upper reaches of the test during the write cycle, but nothing serious. The overall speeds achieved during the tests are essentially the same as a direct connection and there are no errors displayed.
Note that the graph pattern drawn by the test is relatively clean showing very little spiking or variation and clean transitions between the Write tests and the Read test cycle. Note also that no errors are seen in the error count on the right side."

"We would want to run the test for several hours to draw conclusions on this but these results are very promising. We would probably want to also change the Transfer Size parameter of the test up and down to emulate different editor timeline characteristics. Smaller values are used to emulate working with heavily compressed material, the current setting being used to emulate working with DNxHD material, and larger values can be selected to emulate working with uncompressed HD and UHD material."

"To emulate working with several streams of data in a timeline, this shows four independent PathDiag test sessions running simultaneously with the media converters in circuit. In this case, rather than working with unlimited tests, I set the Transfer Rate parameter to 25 MB/sec and allowed the tests to cycle for 30 minutes. Notice from the results graphed in these tests that the individual tests are interacting somewhat – see that the top value levels are becoming choppy and somewhat castellated as each test competes for throughput. Since the tests in aggregate are pushing the maximum limit of the channel (if all four tests happen to be writing or reading simultaneously, the overall write or read bandwidth should be around 100 MB/sec) this interaction is quite normal and would get worse if similar tests were being run on further clients in the ISIS environment as each client competes for access to the storage."
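As a sanity check on those PathDiag numbers, some back-of-envelope arithmetic (a Python sketch, assuming standard 1500-byte Ethernet frames and plain IPv4/TCP with no header options) shows why "north of 100 MBytes per second" is about as good as a 1 Gbit link ever gets:

```python
# Theoretical ceiling for TCP payload throughput over gigabit Ethernet,
# assuming standard 1500-byte frames (no jumbo frames).
LINK_BPS = 1_000_000_000
MTU = 1500
ETH_OVERHEAD = 38      # preamble 8 + header 14 + FCS 4 + inter-frame gap 12
IP_TCP_HEADERS = 40    # IPv4 20 + TCP 20, no options

payload = MTU - IP_TCP_HEADERS          # 1460 payload bytes per frame
wire = MTU + ETH_OVERHEAD               # 1538 bytes on the wire per frame
ceiling_mb_s = LINK_BPS / 8 * payload / wire / 1e6
print(round(ceiling_mb_s, 1))  # ~118.7 MB/s before any file-system overhead
```

Real-world ISIS client traffic carries its own protocol overhead on top, so a sustained 100-110 MB/s is effectively a saturated link.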



Tuesday, December 02, 2014

Why do manufacturers over-specify power requirements for broadcast equipment?

It's actually a rhetorical question and I'm glad they do. Most of the time I have to tell a customer's electrician and air-con contractor how much power (and hence how much heat) the machine room will be pulling/generating. Most customers refuse to believe that 99.9% of the electrical power entering a server room/TV MCR leaves it as heat! Just think about it; a 1 V video signal leaving the room and terminating into 75 ohms represents a tiny amount of energy. Everything winds up as heat, and so I've got to the point where I tell the electrician how many amps we'll need and the aircon guy how many BTUs of heat he'll have to move. By turning them into different units the customer stops complaining!
Anyway - why are the numbers always so different? I've been installing Avid shared storage chassis since 1999 when Unity v 1.2 was considered clever - 500 Gigs across three arrays and usable by around ten edit rooms. Fast forward to 2014 and the ISIS range are what you'll buy from Avid and the new ISIS 2500 near-line storage is just the thing for cheaper, non-edit storage.


This is the rear of this monster - two supplies with 20A C19 inlet connectors, and you can see from the clamp-meter that the thing is pulling 1.3A per supply - de-powering one of the supplies shows the current draw of the single remaining supply rise to 2.6A (so they are properly balanced). Re-powering shows that the total draw across both PSUs rises to 3.3A for around thirty seconds but settles back to a total of 2.6A once everything is up and running.
So, P=IV (and not forgetting the inductive load, which has a power factor of 0.8) means we are seeing a bit less than a kW at peak. However - on the Avid website;
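For anyone wanting to repeat the arithmetic, here's the P=IV sum as a Python sketch (assuming 230 V UK mains and the 0.8 power factor mentioned above):

```python
# Real vs apparent power for the measured ISIS 2500 currents,
# assuming 230 V mains and a 0.8 power factor for the inductive load.
V = 230.0
PF = 0.8

def power(amps, volts=V, pf=PF):
    va = volts * amps     # apparent power (VA) - what the clamp-meter implies
    watts = va * pf       # real power (W) - what actually ends up as heat
    return va, watts

print(power(2.6))   # steady state: ~598 VA, ~478 W
print(power(3.3))   # spin-up peak: ~759 VA, ~607 W -- "a bit less than a kW"
```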

 

Thursday, September 12, 2013

Avid, KVM systems and the cry of "...not fit for purpose"!

There are several things that happen when you plug a monitor into a graphics card. I'm assuming DVI (this is the 21st century!), but all other display standards - HDMI, DisplayPort and Thunderbolt - follow a similar principle.
  1. Pin 16 on the DVI connector on the graphics card is referred to as the "hot plug detect" pin and is held logic-high (at +5 V) through a very high-value resistor. When you attach a monitor the pin is momentarily taken low to alert the graphics card to the fact that a monitor has been connected. 
  2. The graphics card generates an interrupt on the PCI-e bus
  3. Windows sees this and uses it to generate an EDID exchange request
  4. The monitor responds with its EDID profile
  5. If the EDID profile is the same as last time the graphics card driver does nothing; it's the same monitor after all
  6. If the EDID is different the driver re-sets the system resolution - this is particularly important if it's either a lower resolution or frame rate than it was running at before; if it didn't do this you'd get black screens. Nobody wants that.
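To make that EDID comparison concrete, here's a Python sketch of the "is this the same monitor?" check - the identity fields live at fixed offsets in the 128-byte base EDID block. (The sample bytes below are made up for illustration, not read from a real monitor.)

```python
# Minimal EDID sniff: just enough of the 128-byte base block to decide
# "same monitor as last time?" -- the check the driver effectively makes
# before deciding whether to re-set the display mode.
def parse_edid(edid: bytes):
    assert edid[:8] == bytes([0x00, 0xFF, 0xFF, 0xFF,
                              0xFF, 0xFF, 0xFF, 0x00]), "bad EDID header"
    assert sum(edid[:128]) % 256 == 0, "bad EDID checksum"
    # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9.
    packed = (edid[8] << 8) | edid[9]
    mfg = "".join(chr(ord("A") - 1 + ((packed >> s) & 0x1F)) for s in (10, 5, 0))
    product = edid[10] | (edid[11] << 8)            # little-endian product code
    serial = int.from_bytes(edid[12:16], "little")  # 32-bit serial number
    return mfg, product, serial

def same_monitor(old: bytes, new: bytes) -> bool:
    return parse_edid(old) == parse_edid(new)

# Hypothetical sample block (illustrative values, zero-padded, checksummed):
sample = bytearray(128)
sample[0:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
sample[8:10] = (0x10AC).to_bytes(2, "big")     # packed letters 'DEL'
sample[10:12] = (0x1234).to_bytes(2, "little") # product code
sample[12:16] = (1).to_bytes(4, "little")      # serial number
sample[127] = (256 - sum(sample[:127]) % 256) % 256
print(parse_edid(bytes(sample)))  # ('DEL', 4660, 1)
```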
Now then, Avid Media Composer does something different! It listens for the interrupt and, when it sees one, halts operation and displays an error message saying it needs to re-start. Even if all that has happened is that your monitor cable has fallen out of the back of the computer and you've reconnected it - go figure.

The upshot of this is that when you're using a KVM routing solution of any kind and you assign a new pair of monitors to the Avid, it insists on re-starting. I've been aware of this for a while and I let people know about it when demo'ing or presenting at trade shows; it's just the way of things - there is nothing you can do whilst Avid does the wrong thing. Amulet - my favourite KVM-over-IP system - does exactly the right thing: it holds off asserting pin-16 (and triggering the chain of events above) until absolutely necessary. A media operator can be switching around four Media Composers and each workstation is unaware that the operator is being promiscuous. Amulet only asserts pin-16 when there really is no other option; when a new desk-end "zero client" takes control of a machine it hasn't yet seen. The only place I've seen this be a problem so far is when an editor starts a layback to videotape, presses disconnect on his desk-end zero client and calls his operator saying "...I've started the layback, I'm off home; can you watch it through to the end?". When the operator then tries to acquire that Avid, the Amulet has, by necessity, to alert the computer to a new pair of monitors and Media Composer halts (ruining the layback).

This was the issue at ITV Salford and I home-brewed a little gadget to stop the Avid being able to tell when new monitors were attached; I essentially neutered its ability to detect pin-16. 
So, I think Amulet does exactly the right thing; if it didn't, you'd lose the ability for workstations to detect what monitors were attached and pretty soon you'd have rooms with black screens; nobody wants that. The bogeymen here are;
  • Avid - why on earth doesn't it do what Windows does?
  • Editors who don't watch their own laybacks!
We are now in an argument with a customer I explained all this to when I was demo'ing Amulet; we didn't win the SI quote, but we still supplied the KVM. The SI who is installing it is shouting blue murder about "...not fit for purpose" and we're having to brew up 150 adaptors to keep everyone sweet. My feeling is this will cause more trouble further down the line, all for the ability to not close an Avid project on those occasions when you need to hand a machine off to someone else. What kind of workflow needs that?!

Tuesday, March 19, 2013

Fixed the DVI / pin-16 hotplug dilemma

As with a lot of mod'ing or (dare I say it!) bodging solutions you need to find a nice connector or pre-made cable to base your fix on. If you look back at the problem we've been having with Media Composers switched across different Amulet heads then you'll recall it wasn't an EDID issue, rather one of Windows detecting a monitor change; Amulet does the right thing, it's Avid that's the problem.

The fix is easy; you need to tie pin-16 (hot plug detect) to Vcc (+5 V on pin 14) via a 1 kΩ resistor;




The best mod'able pre-made cable for the job is one of these from Lindy. Now I just need to knock up a dozen for the customer!

Tuesday, January 22, 2013

EDID isn't the only thing that graphics cards look at.

Avid Media Composer insists on re-starting when it sees a change of monitor. I always assumed this was done via Windows detecting a new EDID profile, but it turns out that if you unplug and re-plug the same monitor (i.e. EXACTLY the same EDID data - even the serial number) then the same thing happens, and so something deeper is at work.
Look at pin-16; Hot Plug Detect. Basically it is held low by the graphics card but pulled high when a monitor connects - this generates an interrupt in Windows which forces the card to do an EDID refresh - request a new profile and possibly re-do the HDCP handshaking if needed. The interrupt is also seen by Avid and used to force a re-start of Media Composer.

Now then, Amulet (our favourite KVM-over-IP technology) spoofs all of this; it caches the EDID profile until a new client connects, and when that happens it asserts the Hot Plug Detect pin as if a monitor had been connected - essentially spoofing what happens in the real world. There are various registry tweaks to make the graphics driver ignore pin-16 but they work variably - change the driver and suddenly pin-16 is being listened to again.

When I figure out a solution I'll update this post. Meantime the customer's dream of starting off a layback or capture and then handing it off to another client is still on hold....


Tuesday, February 10, 2009

High Def 25Psf video

More than a decade ago, when Mr Sony was developing what would become HDCam (with some small contribution from the previous 1" open-reel HDVS analogue & digital formats), they realised that progressive video was the future but existing HD equipment (typically the BVM-D series monitors) couldn't lock to such slow framerates (24/25/29.97 frames as opposed to 48/50/59.94 fields). The answer for progressively-sourced pictures was the Psf standard, which makes progressive frames look like interlaced video. So as to make film people think this was better than video they coined a new name for a field - the segment. In fact Psf is interlaced video (but with no movement between the fields) - it just shows that good old interlaced video is able to faithfully reproduce progressive pictures (the reverse is not true, as progressive video at the same frame-rate has only half the motion rendition of interlaced video).

So - let's dismiss a couple of misconceptions;

  • There is no difference between a Psf signal and an interlaced signal from a technical standpoint
  • Sending 1080 pictures via Psf doesn't degrade them in the slightest - in fact if you're laying off 1080 to HDCamSR then anything below a 5800 (in 50/60P mode at 880 Mbit/s) is recording Psf!
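You can convince yourself that Psf really is lossless with a trivial Python sketch - split a progressive frame into two "segments" and weave them straight back together:

```python
# Psf in miniature: a progressive frame carried as two "segments"
# (fields) with no temporal offset -- so weaving them back is lossless.
def to_psf(frame):
    """Split a progressive frame (a list of rows) into two segments."""
    return frame[0::2], frame[1::2]      # odd/even line "fields"

def weave(seg1, seg2):
    """Interleave the two segments back into a progressive frame."""
    frame = []
    for a, b in zip(seg1, seg2):
        frame += [a, b]
    return frame

frame = [f"line{i}" for i in range(8)]
s1, s2 = to_psf(frame)
print(weave(s1, s2) == frame)  # True -- nothing lost in the round trip
```

With true interlace the two fields are sampled 1/50th of a second apart, so the same weave would show comb artefacts on motion; with Psf there's no temporal offset to lose.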
Now then - below are screen-grabs from my trusty WFM7120. The first shows the output from a Symphony NitrisDX BOB. The footage had come from a Sony EX3 camera, recorded at 35 Mbit/s 1080/25P onto Memory Stick and imported straight into a progressive timeline. The Avid plays back Psf, which the Tek shows as 1080i (for the reasons discussed above). Laying this off to HDCamSR (a 5500 deck) gives a 25Psf recording on tape. The second screen-shot is the QuickTime sample movie imported into a new 25P timeline - it just serves to confirm that the BOB output is always Psf.



This last picture is the down-convert output of a Leitch X75, which is a great little get-you-out-of-trouble box (it basically does everything-to-everything with a few extra tricks thrown in - profanity delay etc.) but it's not a multi-frame broadcast standards converter (like a Snell & Wilcox Ukon).

Although you can't see it from this screen-grab, the SD output has had its field-dominance changed and the quality of the video ain't great. This wouldn't be an issue for a VHS/DVD review copy or if it was the pre-processed feed for web conversion, but it's not suitable for SD delivery.
For that the best option (short of a £20k Ukon!) is to use the SD-downconvert from the HDCamSR machine.

Tuesday, July 03, 2007

The futility of re-using old storage

I was over at a facility doing a site survey for a fibre network and the owner/operator told me how he planned to re-deploy his ancient 300 gig LanShare storage system as an MP3 storage pool. I did a little mental calculation about the economics of re-using old storage: that model of LanShare consumes a bit less than a kilowatt of power, and assuming he's paying seven or eight pence per kWh then by doing the maths (and bearing in mind the cost of a 300 gig drive - £69.00 inc. VAT today) he's burning the price of that drive in electricity every three weeks! Even setting aside the cost of administrating ANY Avid storage (and the licensing considerations, needing client connection software etc. etc.) there is no good reason to re-use an old LanShare or Unity.
I had the same argument a few years ago with someone who wanted to re-use a 10x9gig fibre array - the cost of a fibre HBA to allow him to re-use it in a PC was many times the cost of a 120gig drive (and I didn't even do the power calculation) - it's NEVER worth it.
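For what it's worth, here's the break-even sum as a Python sketch - the draw, tariff and drive price are rough figures from above, and remember that every watt going into the room has to be pumped out again by the aircon, which roughly doubles the running cost:

```python
# Break-even for keeping old storage spinning, using rough assumed
# figures: ~0.9 kW draw, 7.5p/kWh, and a £69 replacement drive.
DRAW_KW = 0.9
PENCE_PER_KWH = 7.5
DRIVE_COST_POUNDS = 69.0

pounds_per_day = DRAW_KW * 24 * PENCE_PER_KWH / 100
days = DRIVE_COST_POUNDS / pounds_per_day
print(round(days))       # ~43 days on electricity alone
print(round(days / 2))   # ~21 days once aircon roughly doubles the bill
```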

A facility I worked at in the mid-nineties had Paltex edit controllers (of a mid-eighties vintage!). Anyhow - back then the EPROMs containing the software were only 32 Kbytes big. Eventually the system software got to requiring 64 Kbytes of space and so they had to produce an updated system board that actually had the A16 line wired. Imagine my horror when (to avoid the £800 upgrade cost) I had to manually wire (with kynar wire) the most-significant address line on old 32K-capable system boards! By the time I'd finished a couple of them they looked like birds' nests and were about as stable as Charles Manson!
Incidentally - the kilobyte is an obsolete measurement of memory size. Back in the eighties skilled programmers could pack a lot of functionality into a 'K' - I saw a chess program that played a very respectable opening coded in that much space. Nowadays (when programmers are called developers and bolt objects together rather than writing computer code) the megabyte will soon be an obsolete term.
The megahertz is an obsolete term that refers to the speed of old microprocessors' clocks....

Sunday, June 03, 2007

How Avid differs from all other broadcast manufacturers

Avid have this get-out-of-gaol card they play often - it's called the "...that's not a supported configuration" reason. They play it whenever their badly implemented version of an industry-wide standard doesn't play nicely with other equipment. The first time I came across it was the way they do P2 protocol over RS422 on the ABVB-based Mac systems. I was working at Oasis TV back in 1996 when we started to hook original D5 machines (the Panasonic AJ-D580) up to 9500-based V.7 Avids. We noticed that when capturing, the deck would often take off in the wrong direction and then reverse and pre-roll correctly. Often laybacks would fail or be a frame late. Eventually I borrowed a serial sniffer, stuck it across the RS422 out of the Avid and discovered the following.

  • The Avid rarely issued commands in the 17-line window after the start of frame that P2 demands. Consequently the deck would queue the command and ignore it for a frame.

  • The Avid would sometimes issue the pre-roll command before it loaded the counter - the poor old VTR would take off on the pre-roll only to realise that it was going in the wrong direction.
So, a broadcast-grade VTR that conforms to every relevant standard doesn't play nicely with Avid's sloppy implementation of an existing standard, and their justification is that "...it's an unsupported configuration"!
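To put a number on that 17-line window, a quick Python sketch (assuming 625-line/25 fps PAL timing) shows how tight it is and what missing it costs:

```python
# Size of the command window described above: P2/9-pin commands must
# land within 17 TV lines of the frame pulse to be acted on that frame.
LINES_PER_FRAME = 625      # PAL
FRAME_RATE = 25.0

line_us = 1e6 / (LINES_PER_FRAME * FRAME_RATE)   # duration of one TV line
window_ms = 17 * line_us / 1000                  # the P2 command window
frame_ms = 1000 / FRAME_RATE                     # penalty for missing it

print(round(line_us))       # 64 us per line
print(round(window_ms, 2))  # ~1.09 ms to get the command in
print(frame_ms)             # 40.0 ms -- the deck queues you for a full frame
```

So a command issued sloppily anywhere in the other ~97% of the frame gets deferred, which is exactly the one-frame-late behaviour the sniffer showed.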
I've just put in a couple of machines that rely on an outboard FireWire switcher. I would have sworn it was the switcher that was at fault but I had a Root6 ContentAgent to test the various ports with the VTRs. The Avid Mojos will drop the feed if you do anything with the switcher (like make another route!) and you have to re-launch Media Composer. It makes using FireWire decks a real pain and I didn't realise Mojos were so problematic until Graham sent me the following;
This just mirrors my experiences...
Firewire switches, hubs and splitters don’t work CONSISTENTLY. They just don’t. Firewire patchbays and good quality extenders DO.
I have tested half a dozen of these things and they just don’t work without a reboot, a reset, and a mess around.

The same can be said of SCSI, Fibre Channel, and even the way they handle the PCI bus - see a previous post here for details of their half-arsed ADAT implementation.
Now, if a Sony VTR refused to record a standard PAL signal or a Grass Valley mixer wouldn't derive a key from an SDI feed then they would sort it out - I never heard the phrase "...I'm sorry, that's not a supported configuration" until Avid came on the scene!

Wednesday, March 28, 2007

Adrienne Electronics

Do you remember a BBC2 late-night show called Diners? Back in 2002 anything that had the reality label attached to it got commissioned (although admittedly shown very late in the evening). If you Google for it now there is scant evidence of it! I did find a John Walsh article from The Independent. Anyhow - one of the problems I had to solve for that show was using cheap Avids (software-only versions - ExpressDV back then) to capture or log live feeds with timecode. Of course the studio or OB sends you audio timecode (the kind that sounds like a fax and comes down a twisted-pair cable) but all hardware-less edit stations assume the timecode comes down the RS422 line as part of the machine control.
Adrienne Electronics do a range of really useful boxes to address these kinds of problems - the AEC-Box-2 takes in audio timecode and has a 9-pin connector. It emulates a VTR but (being a solid-state box) doesn't actually play or rewind tape - it just tells the Avid it is doing so (in this case it's an Avid MCSoft with SDI Mojo). The really cool thing is that when the Avid asks for timecode down the RS422 the box returns what it is receiving on its input. It's the ideal solution for using cheap workstations to log or capture proper studio or OB-type material.
I'd been puzzling about this for ages last night (got home late from the studio where I was working) and it was only over my tea that I remembered solving the same problem five years before - kinda like how everyone has forgotten Diners!

Wednesday, March 07, 2007

Got my Mojo working

I did investigate this at the Production Show before last (2005) - I thought I'd do this but include balancing for the audio to make it a bit more pro - XLRs & BNCs on the back. I got Bryant to do me a quote for the metalwork, and adding in the cost of a buffer card to balance/level-match the audio brought it to about £250 cost to us. With that in mind I asked maybe a dozen people at the show who were looking at ExpressPro/Mojo if they'd consider it and at what price - they ALL thought £250 would be a waste of cash.

Now - what with SDI Mojo maybe it's worth re-investigating. MC Soft (the current less-than-£5k Avid package) appeals not only to wedding videographers but to broadcasters and serious facilities people. In fact, Root6 have asked me to prototype one up - watch this space.

Tuesday, December 12, 2006

Magic Avid!

This is a killer! Avid have a dual-boot solution that allows you to have both NitrisDS and SymphonyDS apps on the same machine - but rather than doing something modern with a partitioned drive and bootloader they have the Avid dual-boot option that uses a key-switch to power one of two system drives! All built into a nice metal panel that fits in a 5 1/4 inch drive bay and just switches the drive power. I laughed heartily when Joel showed me that.....

Thursday, September 29, 2005

My four biggest beefs with Avid
  • Digi002, ExpressPro Studio, and eight channel audio.
    To get eight channels in and out of a Digi002 console you need to use ADAT 'light-pipe' which necessitates a converter of some sort. Quite why a "pro" product relies on a domestic-style interface for multi-channel work is another topic (not too many VTRs, multi-track tapes or disks come with ADAT, but still). The thing that has really bitten us in the backside is that neither of the models Avid recommend work reliably with the Digi002 (the Fostex UC-8 and the Dua2) and the one gadget that does work (the Alesis A14) only works on short ADAT cables (it's an optical fibre - in any industrial-grade application you'd expect it to work over kilometres, not be limited to three metres!). This flies in the face of Avid's advice - "you can run long ADAT fibres, but don't try and extend the FireWire between the workstation and the Digi002" - well, precisely the reverse is true! Big shout to my colleague Chris Bailey for figuring this all out, and to Joel and Rhys, who I worked late with last night out in the sticks in Hertfordshire testing this (amongst other things).
  • Mojo analogue video performance - still bad, noisy when in component mode (yet Avid claim it's broadcast quality - where, precisely?!). See a previous post here.
  • Adrenaline and NTSC reference - when using Adrenaline in 525 mode it has to have a proper Sc-H consistent sync source - this is fine, but by letting people get away with a cheap'n'cheerful black & burst generator in PAL mode you give clients the expectation that they can escape the additional expense - if you're going to lower the bar then do it consistently or not at all.
  • This pinched from the Avid-L today
    Almost as shocking as the actually time code problem is that Avid refuses to comment on it or address it. Pretty shameful.
    How about if I say that we're aware of it and committed to fix it? Would that be enough? Sincerely, Jeremy Kezer
    (Jeremy is Avid's principal engineer)
Still, compared to Final Cut Pro and Decklink these are mere annoyances!

Friday, April 23, 2004

This from the FAQ in Avid's sales guide (issued at NAB this year):

I've played with Media Composer Adrenaline on both Mac and Windows and there is a real performance difference. Are you making the Mac version run slower?
Of course not, we optimise for both platforms. At this time we are seeing significantly higher real-time performance on current Windows-based workstations than G5 workstations from Apple.


So now we know!