Measurement based converters evaluation database

Measurement based converters evaluation database

Postby Mr.Michaelz » Mon Jun 11, 2018 9:00 am

Hello everyone,

I view converters as a simple studio commodity, something that should be technically accurate.
While it's up to every individual to assess if a given device will fit their workflow, I cannot see why it would be impossible to compare converters on a technical basis; maybe with something like loopback tests performed in a standardized and stable environment.

Does something like this already exist?
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

Re: Measurement based converters evaluation database

Postby SafeandSound Mastering » Mon Jun 11, 2018 9:38 am

Some years back there was a piece of software known as RMAA (RightMark Audio Analyzer) and end users often published DA/AD information online. There is some on the RightMark Audio website, albeit old, and I think GS had some end-user testing. It did not cover every technical parameter but gave noise, THD and frequency response figures etc. I am sure you could search for that and find something online.

Good converter design is far from simple, and to achieve the top-end specifications you need a lot of electronic engineering design knowledge. You will find mastering engineers discussing their preference for one £2,000-£4,000 set of DA/AD over another, so there are subjectively audible differences and preferences even with similar technical specs. They are all going to be up to the task in terms of spec alone, yet some swear by their coveted choices. So specs do not tell the entire story.

There are also some differences depending on what equipment chains they are coupled with, so this variable can change the sound. Though arguably, good input/output stages on DA/AD should sound fairly consistent when coupled with different equipment-chain input/output impedances.

Many of the latest generation DA/AD are based on ESS SABRE 32 bit chips so there has been some advancement in this area of design over the last 5 years or so.

Subjectively even at the higher end of conversion you will hear talk of low mids being less blurred, slight differences in upper mids and top end, ever so slight smiley curves or forward mids, different transient behaviours, stereo image differences, slightly different deepest low end extension etc. All obviously very subtle and likely only heard in exceptional listening environments (which mastering engineers should have).

These differences do exist, especially so in lower-end soundcards; at mastering grade they come down to preferences and to giving an engineer the sense of having "the edge", rather than anything that will make or break a mastering end result.

Specs do have a place but are not the full picture. Testing at least confirms their engineering prowess.
SafeandSound Mastering
Frequent Poster
Posts: 1120
Joined: Sun Mar 23, 2008 12:00 am
Location: South East

Re: Measurement based converters evaluation database

Postby Sam Inglis » Mon Jun 11, 2018 10:53 am

Does anyone claim that it's *not* possible to compare converters on a technical basis?

All the manufacturers of high-end converters that I can think of publish detailed specifications, and it shouldn't be hard to compare them.
Sam Inglis
Frequent Poster (Level2)
Posts: 2317
Joined: Fri Dec 15, 2000 12:00 am

Re: Measurement based converters evaluation database

Postby Mike McLoone » Mon Jun 11, 2018 12:17 pm

It is certainly possible to compare AD and DA converter performance from manufacturer specifications. These specs have been measured using an audio analyzer, and they are a perfectly valid basis for comparing the performance of different hardware.

While there are some (especially in the hi-fi world) who will argue that there are differences which can only be heard and not measured, using a calibrated audio analyzer will give a good idea of converter performance. These analyzers have a larger dynamic range and significantly wider frequency range than any human ear. A Prism dScope will set you back around 6k, an Audio Precision analyzer around 15k. The RMAA software will technically give a similar measurement in software, but without the dedicated and calibrated hardware front-end you are never going to get an absolute measurement of THD&N, noise or anything else. If you are using an audio interface to measure itself, then you have the error on the output and the error on the input to deal with: no reference, therefore no absolute measurement.
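
To make that concrete, here is a minimal sketch of what such a software loopback measurement involves (Python, purely illustrative; it assumes the third-party sounddevice and numpy packages and a physical cable from an output back into an input). The result characterises the whole D-A plus A-D chain together, which is exactly why it can never be an absolute figure for either converter on its own:

    # Minimal loopback THD+N sketch -- illustrative only, not a calibrated measurement.
    # Assumes the 'sounddevice' and 'numpy' packages and an output->input loopback cable.
    import numpy as np
    import sounddevice as sd

    FS = 48000        # sample rate in Hz
    F0 = 1000.0       # test-tone frequency in Hz
    DURATION = 2.0    # seconds

    # Generate a 1 kHz sine with a little headroom.
    t = np.arange(int(FS * DURATION)) / FS
    stimulus = (0.8 * np.sin(2 * np.pi * F0 * t)).astype(np.float32)

    # Play through the D-A and record through the A-D simultaneously.
    # This characterises the *combined* D-A + cable + A-D chain, not either converter alone.
    recorded = sd.playrec(stimulus, samplerate=FS, channels=1, blocking=True).flatten()

    # Discard the start (device latency / settling), then window the remainder.
    x = recorded[FS // 2:] * np.hanning(len(recorded) - FS // 2)

    # THD+N: power outside a narrow band around the fundamental,
    # relative to the power of the fundamental itself.
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / FS)
    fundamental = (freqs > F0 - 20) & (freqs < F0 + 20)
    thd_n_db = 10 * np.log10(spectrum[~fundamental].sum() / spectrum[fundamental].sum())
    print(f"Loopback THD+N: {thd_n_db:.1f} dB")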

Audio Precision have some documents available for free which may be of interest (you need to create a login, but you don't need to spend the 15k on the hardware!):

https://www.ap.com/technical-library/

Have a search for these:

Technote 104: Introduction to the Six Basic Audio Measurements
How to Write (and Read) Audio Specifications
Technote 108: PC Audio Testing

Best,
Mike
Mike McLoone
Regular
Posts: 82
Joined: Wed Apr 23, 2003 11:00 pm
"It's all gone quiet." said Rhubarb "Not nearly quiet enough." said John Cage

Re: Measurement based converters evaluation database

Postby Mr.Michaelz » Tue Jun 12, 2018 10:29 am

Thank you very much for your balanced and informative answers.

SafeandSound Mastering > There are still some traces of benchmarks that were performed in the past, but most of them have disappeared (the discussions apparently got very heated). As for the loopback tests that are currently posted, I think they lack the standardized environment that would make them really relevant.

I understand that there will always be a subjective part to converter evaluation, and I also understand that raw technical performance does not tell everything about how an interface can be integrated into a given studio, for example.

Converter design seems indeed very complex, and that's why I think that measurements have their place if they are taken for what they are and nothing more: evaluation of the transparency of the converter and the signal path.

Sam > Some people seem to claim this, yes. I even read people stating that a given converter can sound more "analogue" than another, which left me in complete confusion.

Companies publish their specs, but they are not always that detailed, and we don't know much about how they have been measured. We do not have a standardized environment.
And I must say that I always take measurements published by the guy who wants to sell his product with a grain of salt.

Mike > Thank you very much for your very informative answer and for pointing me to the AP documents.
It's very useful, as it helps me understand what it would take to create a really trustworthy testing setup.

In these times when marketing is all over the place, I think that people can only rely on themselves to get really accurate information. It's true that 15k is a scary figure when you are alone, but when 100 people put in 150 each, it's far more acceptable to me. I know that is money I would easily spend, as it could potentially help me save thousands in later investments. It would also help me feel confident about those investments, and stop wasting time wondering if I made the right choice.

Do you think that crowdfunding a test lab could be done and be beneficial to the community?
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

Re: Measurement based converters evaluation database

Postby Hugh Robjohns » Tue Jun 12, 2018 10:47 am

Although I don't get to review every converter that passes through SOS, I have done a fair few and always use the same methodology. It's very rare to measure any significant differences in frequency response (for example) -- they are pretty much all perfectly flat across the entire audio band. But there are differences in the anti-alias/reconstruction filter impulse responses and, for those (few) that employ minimum-phase filter designs, the phase shift across the audio band. Where I've found points worthy of note I always include them in my reviews.
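
For anyone curious to see those filter differences for themselves, a crude loopback impulse test shows the combined reconstruction-plus-anti-alias response: a linear-phase design rings symmetrically before and after the peak, while a minimum-phase design rings only after it. A rough sketch (Python, illustrative only; it assumes the sounddevice and numpy packages and a loopback cable):

    # Crude loopback impulse-response check -- illustrative only.
    # Linear-phase converter filters show symmetrical pre- and post-ringing;
    # minimum-phase designs show ringing only after the peak.
    import numpy as np
    import sounddevice as sd

    FS = 48000
    stimulus = np.zeros(FS, dtype=np.float32)
    stimulus[FS // 2] = 0.5        # single-sample impulse, with some headroom

    recorded = sd.playrec(stimulus, samplerate=FS, channels=1, blocking=True).flatten()

    # Print the samples either side of the received impulse's peak.
    peak = int(np.argmax(np.abs(recorded)))
    for offset in range(-30, 31):
        print(f"{offset:+4d}  {recorded[peak + offset]:+.5f}")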

I don't currently have the facilities to test and measure converter jitter artefacts, but I think more differences between converters would be observed from that -- as you can see from the plots included in an article I wrote about external clocking:

https://www.soundonsound.com/techniques/does-your-studio-need-digital-master-clock

Look at the different jitter artefacts from different converters (when running on their internal clocks -- all bets are off when clocked externally!). The artefacts are all very low level, but we can definitely perceive things below the noise floor and these individual artefacts will inevitably interact with the programme content in subtle ways.

However, I have maintained (for my own records) a table of the A-weighted AES17 dynamic range measurements I obtain with an Audio Precision test set, as I feel this provides a pretty reliable indication of the overall quality of the converter design (including its jitter performance). This is because to maximise the AES17 figure the complete technical design has to have an exceptionally good power supply, analogue and digital grounding, internal clocking stability, optimised use of the conversion chip, and superb analogue stages.

It's not a perfect indicator of perceived quality by any means, but it does tally pretty closely with my own perceptions...

So, here are my A-weighted AES17 dynamic range measurements (in decibels) for tested A-D converters, ordered from best to worst. Although these are all nominally 24-bit converters, the equivalent conversion word-lengths run from 20.6 down to 17.1 bits...

    RME ADI-2 Pro 124.0
    Lavry AD11 123.0
    Universal Audio 2192 122.0
    Lynx Hilo 121.3
    Current Focusrite ISA card 121.3
    Merging HAPI 121.0
    Grace Design M108 120.5
    Apogee Symphony 120.0
    Original Focusrite ISA card 119.5
    Prism Lyra 2 118.0
    Prism Titan 118.0
    Crookwood M1 118.0
    Focusrite RedNet 117.0
    UAD Apollo 117.0
    Audient ASP 880 116.5
    Antelope Audio Orion 116.0
    Burl B2 115.0
    Drawmer A2D2 113.7
    Millennia Media AD596 113.0
    Audient ASP800 112.0
    SSL Alpha MX 112.0
    Ferrofish A16-mkii 109.0
    Yellotec PUC2 109.0
    Ferrofish A32 105.0
    Behringer ADA8200 103.0
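
As an aside, the 'equivalent word length' figures quoted above appear to correspond to dividing the A-weighted dynamic range by roughly 6.02dB per bit. A trivial sketch (Python, purely for illustration):

    # Rough conversion from A-weighted dynamic range to 'equivalent bits',
    # using ~6.02 dB per bit (this reproduces the word lengths quoted above).
    def equivalent_bits(dynamic_range_db: float) -> float:
        return dynamic_range_db / 6.02

    for name, dr in [("RME ADI-2 Pro", 124.0), ("Behringer ADA8200", 103.0)]:
        print(f"{name}: {dr:.1f} dB  ->  {equivalent_bits(dr):.1f} bits")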

For A-Ds, I would categorise converters below 106dB as using antiquated technology or having a compromised or poor design. A-D converters measuring between 105 and 111dB are decent project and semi-pro level, while devices achieving 112-118dB can be categorised as excellent professional-quality A-D converters.

Anything in the 120dB region is a genuine flagship high-end product, but A-Ds exceeding 121dB define the current state-of-the-art.

Here are the A-weighted AES17 dynamic range figures for tested D-A converters (in decibels). Again, the equivalent word lengths run from 21.4 down to 17.4 bits.

    Apogee Symphony 129.0
    Merging HAPI 126.3
    Benchmark DAC2 HGC 125.0
    Universal Audio 2192 125.0
    RME ADI-2 Pro 121.0
    Antelope Eclipse 384 121.0
    Lynx Hilo 120.5
    RME ADI-2 DAC 120.1
    Grace m905 119.7
    Focusrite RedNet 119.0
    Crookwood M1 119.0
    Focusrite Forte 118.0
    UAD Apollo 118.0
    Benchmark DAC1 117.6
    Drawmer HQ 117.0
    Prism Lyra 1 116.0
    Prism Titan 116.0
    Antelope Audio Orion 116.0
    Cambridge Audio DAC Magic 115.9
    Burl B2 114.5
    Ferrofish A16-mkii 113.0
    Grace m902 112.6
    Mytek Liberty DAC 112.5
    Lindell DACX 112.0
    Ferrofish A32 111.0
    TC Electronic Clarity X 111.0
    Yellotec PUC2 107.0
    Behringer ADA8200 104.5

For D-As, the AES17 dynamic range numbers are generally a little higher, but anything managing more than 125dB is exceptional. 116-122dB is genuine top-notch professional quality, while 110-116dB is good solid project studio fayre. There's no excuse these days for anything achieving less than 110dB...

Hope that is of assistance and interest... It's interesting to note that the Prism Orpheus doesn't come out especially well in the AES17 tests, yet it sounds stunningly good. It's an odd anomaly, but I note that the Orpheus appears to have one of the cleanest jitter spectra, with no obvious (anharmonic) artefacts at all below the test-tone frequency -- unlike most of the others tested in that article above. And the worst sounding converter has the strongest and most numerous sub-test-tone frequency artefacts... A causal link perhaps? ;-)
Hugh Robjohns
Moderator
Posts: 21552
Joined: Thu Jul 24, 2003 11:00 pm
Location: Worcestershire, UK
Technical Editor, Sound On Sound

Re: Measurement based converters evaluation database

Postby blinddrew » Tue Jun 12, 2018 12:12 pm

Makes me wonder if I should get the third input on my DACmagic fixed.
But given the untreated room and hi-fi amp and speakers I think I'll leave it further down the list! :)
blinddrew
Jedi Poster
Posts: 4716
Joined: Sat Jul 04, 2015 11:00 pm
Location: York
Ignore the post count, I have no idea what I'm doing...

Re: Measurement based converters evaluation database

Postby Martin Walker » Tue Jun 12, 2018 2:25 pm

Fascinating post Hugh - thanks for all that sorted data :clap:


Martin
Martin Walker
Moderator
Posts: 12536
Joined: Wed Jan 13, 2010 8:44 am
Location: Cornwall, UK

Re: Measurement based converters evaluation database

Postby Mr.Michaelz » Wed Jun 13, 2018 2:22 pm

Thank you very much for your great post Hugh, it is indeed very helpful!

By the way, thank you also for your article about clocking, which is always a trustworthy reference when discussions lean towards the voodoo side of things a bit too much.
It also reminded me that it would be interesting to evaluate converters’ performance when clocked internally vs. externally, along with their external clock recovery, as channel counts tend to get higher even in home studios.

More thanks for your measurements! I’m quite surprised by a few things in that hierarchy, and I think it says a lot about how important it is to rely on real-world testing for specs.
It also makes me regret that you are not reviewing all the converters and interfaces that SOS covers, because some big names are missing. I also think it would be very beneficial for readers to see a systematic measurement set published in every review for this type of device.

From what I understand, what you're missing in order to perform thorough measurements is a dScope, am I right?
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

Re: Measurement based converters evaluation database

Postby James Perrett » Wed Jun 13, 2018 2:54 pm

Mr.Michaelz wrote:I also think it would be very beneficial for readers to see a systematic measurement set published in every review for this type of device.

I've just made a similar point in another thread. Some magazines used to have two sections to a review - a general operational review, similar to most modern reviews, followed by a separate technical review done by a different person if the operational reviewer didn't have the appropriate test gear.

I've been reading elsewhere that some budget audio interfaces use the same filters for both 48kHz and 96kHz sample rates - this is the sort of thing that could easily be overlooked by a non-technical reviewer but instantly picked up by a simple frequency response check.
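
For what it's worth, that check is straightforward to sketch: run a loopback at 96kHz and see whether the response extends well beyond 20kHz or rolls off where a 48kHz-rate filter would put it. A minimal illustration (Python; it assumes the sounddevice, numpy and scipy packages and a loopback cable):

    # Quick loopback frequency-response check at 96 kHz -- illustrative sketch.
    # If a 48 kHz-rate filter were still in circuit, the response would already be
    # well down by ~24 kHz instead of extending towards 40 kHz and beyond.
    import numpy as np
    import sounddevice as sd
    from scipy.signal import welch

    FS = 96000
    rng = np.random.default_rng(0)
    noise = (0.5 * rng.standard_normal(5 * FS)).astype(np.float32)   # flat-spectrum stimulus

    recorded = sd.playrec(noise, samplerate=FS, channels=1, blocking=True).flatten()

    # Average the recorded spectrum and normalise it to the level around 1 kHz.
    freqs, psd = welch(recorded[FS // 2:], fs=FS, nperseg=8192)
    ref = psd[(freqs > 900) & (freqs < 1100)].mean()
    response_db = 10 * np.log10(psd / ref)

    for f_check in (10000, 20000, 24000, 30000, 40000):
        idx = int(np.argmin(np.abs(freqs - f_check)))
        print(f"{f_check / 1000:5.1f} kHz: {response_db[idx]:6.1f} dB")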
James Perrett
Moderator
Posts: 7426
Joined: Sun Sep 09, 2001 11:00 pm
Location: The wilds of Hampshire
JRP Music - Audio Mastering and Restoration. JRP Music Facebook Page

Re: Measurement based converters evaluation database

Postby Hugh Robjohns » Wed Jun 13, 2018 2:59 pm

Mr.Michaelz wrote:I’m quite surprised by a few things in that hierarchy, and I think it says a lot about how important it is to rely on real-world testing for specs.

As I said, the AES17 figure is certainly not the be-all and end-all, and some products are in unexpected positions in the listing, but achieving a high score does require good engineering throughout the design so I believe it is a useful indicator of design excellence, if not subjective sound quality.

It also makes me regret that you are not reviewing all the converters and interfaces that SOS covers

We do the best we can, but it's the practicalities of publishing, really. There are only so many hours in the day that I want to spend measuring and writing... :-)

I also think it would be very beneficial for readers to see a systematic measurement set published in every review for this type of device.

Or, indeed, every device... Yes, it would be fantastic if we could, but again, the practicalities make that impossible -- not just the expertise and equipment to make valid measurements, but also the space to publish the results. Moreover, it could be (and has been) argued that such technical depth would only be of interest to a small subset of readers anyway, and might even frighten others off the magazine completely.

From what I understand, what you're missing in order to perform thorough measurements is a dScope, am I right?

A dScope would do the job nicely, yes, and I did borrow one for that article. Sadly, we don't have a budget to buy one, though, and Prism aren't inclined to lend us one on a long-term loan (unlike Audio Precision).

H
Hugh Robjohns
Moderator
Posts: 21552
Joined: Thu Jul 24, 2003 11:00 pm
Location: Worcestershire, UK
Technical Editor, Sound On Sound

Re: Measurement based converters evaluation database

Postby Hugh Robjohns » Wed Jun 13, 2018 3:04 pm

James Perrett wrote:I've been reading elsewhere that some budget audio interfaces use the same filters for both 48kHz and 96kHz sample rates

:o Shocking! But I'm struggling to see how that would be achieved since everything uses delta-sigma converters these days in which the anti-alias and reconstruction filters are always performed digitally and their corner frequency is inherently locked to a proportion of the sample rate.

H
Hugh Robjohns
Moderator
Posts: 21552
Joined: Thu Jul 24, 2003 11:00 pm
Location: Worcestershire, UK
Technical Editor, Sound On Sound

Re: Measurement based converters evaluation database

Postby Mr.Michaelz » Thu Jun 14, 2018 12:15 pm

Hugh Robjohns wrote:As I said, the AES17 figure is certainly not the be-all and end-all, and some products are in unexpected positions in the listing, but achieving a high score does require good engineering throughout the design so I believe it is a useful indicator of design excellence, if not subjective sound quality.

Design excellence would be worthy of evaluation in itself! I think it would also be interesting to figure out which measurement has the most influence on sound quality, and the larger the sample series, the more robust the analysis would be.
Having said that, I think that if one wanted to be thorough in the analysis, it might also be best to leave out the subjective part of the review entirely. Discussion seems to go wild when the two are mixed.

Hugh Robjohns wrote:Or, indeed, every device... Yes, it would be fantastic if we could, but again, the practicalities make that impossible -- not just the expertise and equipment to make valid measurements, but also the space to publish the results. Moreover, it could be (and has been) argued that such technical depth would only be of interest to a small subset of readers anyway, and might even frighten others off the magazine completely.

I understand that SOS is maybe not the best place to go full nerd, and you certainly know your target audience better than I do.
Do you think a dedicated website like this one would be more appropriate? http://src.infinitewave.ca/

Hugh Robjohns wrote:A dScope would do the job nicely, yes, and I did borrow one for that article. Sadly, we don't have a budget to buy one, though, and Prism aren't inclined to lend us one on a long-term loan (unlike Audio Precision).

Candid question: if the test unit is a Prism, could the neutrality of the test be questioned (both technically and subjectively), given the fact that they also produce converters?
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

Re: Measurement based converters evaluation database

Postby Matt Houghton » Thu Jun 14, 2018 12:48 pm

Mr.Michaelz wrote:I understand that SOS is maybe not the best place to go full nerd, and you certainly know your target audience better than I do.
Do you think a dedicated website like this one would be more appropriate? http://src.infinitewave.ca/

You have to remember that you need both access to all the converters being measured, and to expensive test sets for every person doing the measuring. In terms of budget and logistics, that's not trivial. That SRC database is possible because it's a software process. Lots of people can run a standard test in software.

Perhaps the best-placed organisation to do something like this would be a test analyser manufacturer — one who doesn't manufacture their own converters, and has no vested interest other than to demonstrate the brilliance of their test sets! But then... if everything were measured and published, who's going to buy their test equipment!? :headbang:
Matt Houghton
Frequent Poster
Posts: 911
Joined: Tue Aug 07, 2007 11:00 pm
SOS Reviews Editor

Re: Measurement based converters evaluation database

Postby Hugh Robjohns » Thu Jun 14, 2018 1:46 pm

Mr.Michaelz wrote:Do you think a dedicated website like this one would be more appropriate? http://src.infinitewave.ca/

Probably, but it would be an expensive undertaking to set up, populate, and maintain, with a pretty limited target audience and the potential risk of litigation from unhappy manufacturers.

Candid question: if the test unit is a Prism, could the neutrality of the test be questioned (both technically and subjectively), given the fact that they also produce converters?

Depends who is performing the testing, surely?

The dScope is undoubtedly a very high quality and accurate audio measurement tool, widely used across the industry not least due to its impressively wide range of testing facilities which go beyond those of some other popular test sets.

In the period that I had access to a dScope I was able to compare its measurements with those of the APx515 I normally use, and they came up identical when I measured a range of different products, so I have no concerns that the dScope in some way favours Prism converters, if that's what you're implying!

H
Hugh Robjohns
Moderator
Posts: 21552
Joined: Thu Jul 24, 2003 11:00 pm
Location: Worcestershire, UK
Technical Editor, Sound On Sound

Re: Measurement based converters evaluation database

Postby Mr.Michaelz » Fri Jun 15, 2018 9:10 am

Matt Houghton wrote:You have to remember that you need both access to all the converters being measured, and to expensive test sets for every person doing the measuring. In terms of budget and logistics, that's not trivial.

I understand that it would be very costly to buy a device for every SOS reviewer! I was not necessarily thinking of you (SOS, that is, not you personally) doing those tests.
The way I imagine it might work is by getting a test device and setting it up somewhere permanent (in a university lab, for example).
Access to the converters is where you would have a natural advantage, as they are already sent to you, but maybe certain shops would be OK with lending demo units, and some users would be willing to lend their equipment to help build the database.
I think doing it that way could keep the costs down, but we would need to create quite a large and motivated community around the project.
Assessing whether there would be sufficient interest to crowdfund the initial investment and to source the converters was my initial motivation for posting this topic. From what I read in your answers, that may not be the case.

Matt Houghton wrote:That SRC database is possible because it's a software process. Lots of people can run a standard test in software.

I agree it's far easier to do software assessment - I was posting this as an illustration of a possible way to present the results in a dedicated space.

Matt Houghton wrote:Perhaps the best-placed organisation to do something like this would be a test analyser manufacturer — one who doesn't manufacture their own converters, and has no vested interest other than to demonstrate the brilliance of their test sets! But then... if everything were measured and published, who's going to buy their test equipment!? :headbang:

:D I think that the ones who could be most motivated to do this are the end users: people who are striving to get the best possible sound, and investing thousands to achieve it. This is a heavy investment for some people, and I thought a scientific reference could help when it comes to choosing a front end.

Hugh Robjohns wrote:Probably, but it would be an expensive undertaking to set up, populate, and maintain, with a pretty limited target audience

I do not think that building the website and maintaining it would be costly per se, as hosting prices are quite low nowadays, and there are great content management systems.
Building the lab, sourcing the converters and finding the people to perform the measurements is another story indeed.
By the way, how long would it take to test a converter (with the full measurement set, supposing we had the necessary equipment)?

Hugh Robjohns wrote:and the potential risk of litigation from unhappy manufacturers.

That would be another story then. But a test is just a test - how could a manufacturer do anything about this?

Hugh Robjohns wrote:In the period that I had access to a dScope I was able to compare its measurements with those of the APx515 I normally use, and they came up identical when I measured a range of different products, so I have no concerns that the dScope in some way favours Prism converters, if that's what you're implying!

Thank you for your answer on this.
I was shooting in the dark here, but I was wondering if people could find a bias in the testing procedure.
I am not technically knowledgeable enough to know if a Prism clock signal is more easily recovered by a Prism converter, for example.
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

Re: Measurement based converters evaluation database

Postby Hugh Robjohns » Fri Jun 15, 2018 9:55 am

Mr.Michaelz wrote:I do not think that building the website and maintaining it would be costly per se, as hosting prices are quite low nowadays, and there are great content management systems. Building the lab, sourcing the converters and finding the people to perform the measurements is another story indeed.

Quite so. The website is the trivially easy and cheap bit. Investing in the test gear and a knowledgeable engineer to use it is very expensive. Shipping converters around the world from manufacturers to the test house is also very expensive and, as some of them cost several thousands of pounds, the shipping insurance costs will also be ... er... very expensive...

By the way, how long would it take to test a converter (with the full measurement set, supposing we had the necessary equipment)?

A couple of hours, assuming nothing untoward is found and connectivity is straightforward, although it depends precisely what range of tests is required, and how many channels need to be tested, obviously.

But a test is just a test - how could a manufacturer do anything about this?

There are different ways of testing things and different measurement standards. For example, I prefer to use A-weighting when performing AES17 dynamic range tests, but the AES standard actually calls for CCIR-2K weighting. The numbers will be different with different weighting filters, and casual readers may not appreciate why your tests give different numbers from a manufacturer's specs. A manufacturer may perceive this as damaging to their reputation and sales, and the more litigious will try and do something to prevent your publication.... Such situations can be defended, but it all adds to the administration and running costs... At the very least, you would need to be very sure of the legitimacy of your test procedures and calibration of your test equipment...
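
To put a rough number on how much the choice of weighting matters, here is a small sketch (Python, illustrative only) that applies the standard IEC A-weighting magnitude curve to a noise spectrum; CCIR-2k weighting uses a quite different curve with a pronounced presence-region peak, so the same converter will legitimately carry two different dynamic range figures depending on which is used:

    # Sketch: how a weighting filter changes a noise figure -- illustrative only.
    # Implements the standard A-weighting magnitude response (0 dB at 1 kHz) and
    # applies it to a noise spectrum in the frequency domain.
    import numpy as np

    def a_weighting_db(f):
        """A-weighting relative response in dB, normalised to 0 dB at 1 kHz."""
        f = np.asarray(f, dtype=float)
        ra = (12194.0**2 * f**4) / (
            (f**2 + 20.6**2)
            * np.sqrt((f**2 + 107.7**2) * (f**2 + 737.9**2))
            * (f**2 + 12194.0**2)
        )
        return 20 * np.log10(ra) + 2.00

    # Example: one second of white noise at 48 kHz, measured unweighted vs A-weighted.
    FS = 48000
    noise = np.random.default_rng(1).standard_normal(FS)
    spectrum = np.abs(np.fft.rfft(noise))**2
    freqs = np.fft.rfftfreq(len(noise), 1 / FS)

    unweighted_db = 10 * np.log10(spectrum[1:].sum())           # skip the DC bin
    weight = 10 ** (a_weighting_db(freqs[1:]) / 10)
    weighted_db = 10 * np.log10((spectrum[1:] * weight).sum())
    print(f"A-weighting shifts this noise figure by {weighted_db - unweighted_db:.1f} dB")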

H
Hugh Robjohns
Moderator
Posts: 21552
Joined: Thu Jul 24, 2003 11:00 pm
Location: Worcestershire, UK
Technical Editor, Sound On Sound

Re: Measurement based converters evaluation database

Postby Mr.Michaelz » Mon Jun 18, 2018 4:42 pm

Hugh Robjohns wrote:Quite so. The website is the trivially easy and cheap bit. Investing in the test gear and a knowledgeable engineer to use it is very expensive. Shipping converters around the world from manufacturers to the test house is also very expensive and, as some of them cost several thousands of pounds, the shipping insurance costs will also be ... er... very expensive...

I think I got your point :D
Maybe if one wanted to try this, the best way would be to stay near a big "music city", where there are quite a few studios, a big community and also distributors; avoiding shipping as much as possible might make it manageable?

Hugh Robjohns wrote:A couple of hours, assuming nothing untoward is found and connectivity is straightforward, although it depends precisely what range of tests is required, and how many channels need to be tested, obviously.

I did not realize you were performing the measurements for all individual channels. So is the final figure you are giving an average for all the measured channels? Do you often measure significant differences between channels on a given device?

Hugh Robjohns wrote:There are different ways of testing things and different measurement standards. For example, I prefer to use A-weighting when performing AES17 dynamic range tests, but the AES standard actually calls for CCIR-2K weighting. The numbers will be different with different weighting filters, and casual readers may not appreciate why your tests give different numbers from a manufacturer's specs. A manufacturer may perceive this as damaging to their reputation and sales, and the more litigious will try and do something to prevent your publication.... Such situations can be defended, but it all adds to the administration and running costs... At the very least, you would need to be very sure of the legitimacy of your test procedures and calibration of your test equipment...

Thank you very much for this clarification, I understand better where the potential contentious points could reside.
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

Re: Measurement based converters evaluation database

Postby Hugh Robjohns » Mon Jun 18, 2018 5:40 pm

Mr.Michaelz wrote:I did not realize you were performing the measurements for all individual channels. So is the final figure you are giving an average for all the measured channels? Do you often measure significant differences between channels on a given device?

If you don't test everything you're testing nothing! :-)

I don't average results, I give the 'worst case' values because that's the deciding/limiting factor in whether a device is fit for purpose.

And there's no point in a multichannel device having different specs on different channels -- if there are significant differences between channels, something is probably broken. Having said that, though, on multichannel mic preamps it's not unusual to see slightly higher levels of mains hum on the channels closest to the power supply / mains transformer, and often the crosstalk varies between channels in multichannel gear depending on circuit board layouts or internal wiring arrangements.

H
Hugh Robjohns
Moderator
Posts: 21552
Joined: Thu Jul 24, 2003 11:00 pm
Location: Worcestershire, UK
Technical Editor, Sound On Sound

Re: Measurement based converters evaluation database

Postby Mr.Michaelz » Tue Jun 19, 2018 7:38 am

Thank you very much for the explanations Hugh.
Mr.Michaelz
Poster
Posts: 19
Joined: Wed Feb 18, 2009 12:00 am

