The purpose of the "equalization curve" in record manufacture is twofold: to permit longer recording times and to improve sound quality.
A loud bass note would require a large stylus excursion, causing tracking problems and wide grooves that waste playing time. Excessive high-frequency energy would likewise cause sonic issues on playback.
So, to make the LP system work effectively in real-world use, engineers used a "trick": cut the bass frequencies and boost the treble during cutting, then restore the amplitudes to flat during playback with an equaliser applying the inverse of the encoded curve.
In 1954 the Recording Industry Association of America (RIAA) promoted an equalization curve intended to serve as a global standard for records.
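The cut-the-bass, boost-the-treble idea can be sketched numerically. The published RIAA playback (de-emphasis) characteristic is defined by three standard time constants (3180 µs, 318 µs and 75 µs); the snippet below is a minimal sketch, not production DSP, computing playback gain relative to 1 kHz:

```python
import math

# Standard published RIAA playback (de-emphasis) time constants, in seconds.
T1 = 3180e-6   # bass shelf (~50 Hz)
T2 = 318e-6    # mid transition (~500 Hz)
T3 = 75e-6     # treble rolloff (~2122 Hz)

def riaa_playback_db(f):
    """RIAA playback gain in dB, relative to 1 kHz, at frequency f (Hz)."""
    def mag(freq):
        w = 2 * math.pi * freq
        num = math.sqrt(1 + (w * T2) ** 2)
        den = math.sqrt(1 + (w * T1) ** 2) * math.sqrt(1 + (w * T3) ** 2)
        return num / den
    return 20 * math.log10(mag(f) / mag(1000.0))

# On playback, bass is boosted ~+19 dB and treble cut ~-20 dB,
# mirroring the opposite pre-emphasis applied during cutting.
print(round(riaa_playback_db(20), 1))     # 19.3
print(round(riaa_playback_db(1000), 1))   # 0.0
print(round(riaa_playback_db(20000), 1))  # -19.6
```

The cutting curve is simply the mirror image of this function, so cutting and playback sum to flat.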
Before then (through the 1940s and early 1950s), each record company used its own similar equalization technology, and it is generally accepted that over 100 combinations of turnover and rolloff frequencies were in use. The main ones were Columbia-78, Decca-U.S., European (various), Victor-78 (various), Associated, BBC, NAB, Orthacoustic, World, Columbia LP, FFRR-78 and microgroove, and AES.
But Michael Fremer, Senior Editor of Stereophile magazine and a respected journalist, researcher and industry expert, has met with and spoken to retired engineers who were actually there during the RIAA curve's adoption and acceptance.
He has reliable evidence from studio notes, engineering specifications and training manuals of the era showing that "pre-stereo curves" such as Capitol, Columbia and FFRR were used on stereo records as TONE CONTROLS on playback equipment to make a particular album sound 'better', but that the LP itself was cut in the studio to the RIAA "correct" playback curve!
You can read the full account here: http://www.analogplanet.com/category/can-i-have-my-money-back
Michael ASKED Ron McMaster as he sat at the mastering board in Capitol's mastering suite. As for London/Decca you can just LOOK at the jackets. They say "use the RIAA curve" but if that's not enough for you, I asked veteran Decca mastering engineer George Bettyes who mastered between 1957 and 1972. If you have a London/Decca record whose lacquer number is followed by the letter "L" George Bettyes did the cut.
He told me IN NO UNCERTAIN TERMS that there is NO SUCH THING as the "FFSS" curve. "FFSS was a marketing tool like 'Living Stereo' or 'Living Presence'. It is NOT an EQ curve." He also insisted that Decca used the RIAA curve. On another subject of contention he also told me that Decca and London records were identical. I was sent a production order for a classical title that backed up that claim.
As for the notion that the Columbia curve used in the early LP days for mono LP playback (and useful for those records today) somehow continued being used in the stereo era, this too is simply NOT TRUE. I received this from someone who was there:
"When Sony closed 54th Street, we discovered a large amount of old Columbia documentation. Binders full of memos, schematics, etc... I've taken it upon myself to scan all of these documents for posterity. I came across one that answered the burning RIAA/Columbia curve question from a few months ago.
It was a document from William (Bill) Buchman, Director of Electronic Engineering and Research. He states some general facts about the Columbia curve and how the competition has tried to alter their curves to mimic the Columbia curve. This has forced a standard to arise, the RIAA curve, which is identical to the NARTB standard.
He plots a graph to show the similarities and discusses the differences between the two. He goes on to state that the differences in production can account for a greater difference than the difference between the two curves (italics mine).
And here's the kicker...He says the RIAA curve is ideal for playback of Columbia LPs and that a gradual change over to RIAA should be carried out without distinguishing the differences between the RIAA and Columbia pressings.
This document is not dated, but it was filed between memos from 1955 and 1956.
I feel this is the document which clarifies that Columbia dropped their curve in the mid-50's and quietly gravitated toward the RIAA curve and was not using the proprietary curve in the 70's."
Yet there are still deluded individuals claiming that Columbia used its own curve into the '70s and '80s, even though by that point much of Columbia's cutting was farmed out to independent mastering houses. These individuals are basing their conclusions on nothing more than the COLUMBIA LABEL being slapped on the record.
And then there was this, which came directly from a veteran Columbia mastering engineer:
"I can absolutely, positively say, there was no "Columbia curve" (in use during the stereo era). I suggest whoever came up with this BS should stop smoking crack and get down to earth, take a walk in Central Park and smell some fresh air. Is this the case of another clueless as...le trying to write the book about the music business, while working at the Good Year Tire Center changing oil full time? Tell that sh...ck Columbia was doing the same thing every other studio was doing: using RIAA curve, period. This has to be the most ridiculous crap I have ever heard."
The review goes on to claim that Vanguard, Motown, Pablo, Prestige, Impulse!, Roulette and other labels require specific curves.
"Vanguard and Motown records were originally mastered by RCA. There is ZERO difference between an original Vanguard and an original Motown and an RCA "Living Stereo", which absolutely positively used the RIAA curve. Second Vanguard pressings (orange label) were mostly mastered at Columbia. The paper label color means ABSOLUTELY NOTHING. You cannot decide the appropriate curve based on the "label". Pablo was also manufactured and distributed by RCA but many Pablos were mastered at Kendun. Check your Pablos. Again the label means NOTHING.
Prestige, Impulse! and others listed in this review were mostly mastered by Rudy Van Gelder ("RVG" or "Van Gelder"). His lathe cut Blue Notes too. The only difference is the label art. How can label art determine EQ curve???? All were cut using the RIAA curve. Roulette? Mostly cut at Bell Sound. RIAA curve too. Interesting lathe according to lathe guru Sean Davies but RIAA.
Prestige OJC series were cut by George Horn at Fantasy. They were cut using the RIAA curve not some "fantasy" curve for Prestige records." End quote.
As you can see, RIAA was the most popular and accepted standard. Buying a phono stage with all the other curves is a fun way to alter the sound after the fact. That is for the audio buyer to decide, and if that is important to you then there are a number of brands on the market that offer tone controls.
Our brands are RIAA adherents and do not offer tone controls. Custom builds are available at additional cost to support any curve.
DMM (Direct Metal Mastering) was co-developed by Teldec and Neumann as an "improved" vinyl record manufacturing technology.
The idea was to bypass the lacquer-to-metal plating process and remove a further series of steps from the manufacturing chain, thus preserving more of the source.
The cutting lathe used for DMM engraves the audio signal directly onto a copper-plated master disc, instead of engraving the groove into a lacquer-coated aluminum disc.
A lot of experienced audiophiles have lamented that DMM records can sound edgy and bright in some instances.
Examination of early DMM discs revealed what appeared to be a high frequency modulation in the groove, thought to be caused by the use of an ultrasonic carrier tone. In fact, there was no carrier tone and the modulation was simply caused by the vibration (squeal) of the cutter head as it was dragged through the copper disc.
These cutting-head resonances also occur on lacquer cutting machines and depend on the cutter head design. Ortofon heads resonate differently from Westrex heads, Neumanns, or modern NIB-magnet air-cooled heads. These are mechanical resonances excited by the music signal as it is amplified through the electromechanical coils and suspension systems that move the torque tube to which the diamond or sapphire cutting tip is attached.
Engineers use various methods to reduce these "head" resonances including injecting small 180 degree "counter" tones to cancel the majority of the resonances when they occur. This is part of the "black art" of the engineer and why good engineers are in demand worldwide for their ability to "drive" a "lathe".
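The counter-tone idea is just destructive interference: a tone of the same frequency and amplitude, shifted 180 degrees, sums with the resonance to (near) zero. The toy model below (frequencies and amplitudes are arbitrary illustrative values, not from any real cutter head) shows perfect cancellation, and how an amplitude mismatch leaves a residual, which is why this remains a "black art":

```python
import math

fs = 48000          # sample rate (Hz)
f_res = 1000.0      # hypothetical head resonance frequency (Hz)
n = 480             # 10 ms of samples

# Unwanted resonance riding on the cutter signal.
resonance = [0.2 * math.sin(2 * math.pi * f_res * t / fs) for t in range(n)]

# Injected counter tone: same frequency and amplitude, 180 degrees out of phase.
counter = [0.2 * math.sin(2 * math.pi * f_res * t / fs + math.pi) for t in range(n)]
peak = max(abs(r + c) for r, c in zip(resonance, counter))
print(peak < 1e-9)  # True: perfectly matched counter tone cancels the resonance

# Real counter tones are never matched exactly; a 10% amplitude error
# (0.18 instead of 0.2) leaves a residual of about 10% of the original.
counter_off = [0.18 * math.sin(2 * math.pi * f_res * t / fs + math.pi) for t in range(n)]
peak_mismatch = max(abs(r + c) for r, c in zip(resonance, counter_off))
print(round(peak_mismatch, 3))  # 0.02
```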
These counter tones may have given rise to the story that a "carrier" tone is used on DMM.
So there are great-sounding DMM pressings and some not so great. It is a technological advance but a more difficult process, and copper plates are more expensive than the aluminium and acetate lacquer alternatives.
Records manufactured with this technology are often marked by a "DMM" logo on the outer record sleeve.
The Direct Metal Mastering technology is also thought to address the lacquer mastering issue of pre-echoes during record play. This is caused by the cutting stylus unintentionally transferring some of the subsequent groove wall's impulse signal into the previous groove wall. In particular, a quiet passage followed by a loud sound often clearly revealed a faint pre-echo of the loud sound occurring 1.8 seconds ahead of time (the duration of one revolution at 33 rpm). This problem could also appear as post-echo, 1.8 seconds after a peak in volume.
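The 1.8-second figure follows directly from the rotation speed: adjacent grooves sit one revolution apart, so any groove-to-groove transfer appears exactly one rotation period early (or late). A quick sanity check:

```python
def echo_offset_seconds(rpm):
    """Pre/post-echo offset = the time taken for one full revolution."""
    return 60.0 / rpm

print(round(echo_offset_seconds(100.0 / 3.0), 3))  # 33 1/3 rpm -> 1.8 s
print(round(echo_offset_seconds(45.0), 3))         # 45 rpm -> 1.333 s
```

The same logic predicts a roughly 1.33-second echo spacing on a 45 rpm disc.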
The jury may still be out on this: tape enthusiasts point to magnetic bleed-through (print-through) in tape storage as another cause of the problem, and another discussion suggests it could be caused by crosstalk from the pre-read head of the mastering tape machine into the cutting head. It is probably enough to start a whole FAQ on this topic alone.
Another DMM improvement is noise reduction. The lacquer mastering method bears a higher risk of adding unwanted random noise to the recording, caused by the enclosure of small dust particles when spraying the silvering on the lacquer master, which is the necessary first step of the electroplating process for reproduction of the master disc. As the DMM master disc is already made of metal (copper), this step is not required, and its faults are avoided.
DMM was popular during the 1980s but seems to have fallen out of favor in most mastering houses, with lacquer cutting preferred over copper. Europadisk, one of the last DMM facilities in the U.S., closed down in 2005.
Abbey Road had, or perhaps still has, a DMM cutting system, but they too switched over to lacquer.
In general terms DMM offers a more precise sound, with sharper transients and better image "edge definition," while the lacquer cut seems smoother, warmer and more pleasing on the ears.
Which is more "accurate"? There are many technical reasons to argue DMM is, but given that the huge catalog of new vinyl is mostly made on older lacquer mastering systems, it may be a moot point except for collectors who seek out old DMMs or rare new DMM releases.
So there is a fuss about DMM. Seek out a copy and try to find an equivalent pressing mastered on acetate to compare. It can make for some very interesting listening sessions.
You will find contentious, stimulating, thought-provoking, sleepless-night (generating and curing ;)) barneys, blues, fisticuffs and funny stuff that only this hobby can generate.
We will likely be wrong, or right, or a bit each way, but most assuredly respectful of all walks of life and open to divergent ideas and thoughts.
"Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations."
Is a phono cartridge a balanced or an unbalanced device? Well, it depends on who you ask.
Dallas Clarke says "An electromagnetic transducer of any sort is not by nature committed to a balanced or unbalanced configuration until such time as it is wired one way or the other.
Dynamic microphones typically have an XLR receptacle and will operate balanced when connected (XLR to XLR) to a mixer with transformer-coupled inputs, or unbalanced when connected (by an XLR to 1/4" jack) to an instrument amplifier because the head-lead ties XLR pins 1 & 3 together, thereby grounding one end of the coil.
Precisely the same principle applies to phono cartridges... both ends of both coils are connected to discrete wires within the arm tube - typically, five terminals or a five-way connector is present at the base - the decision to unbalance the signal is therefore made by the head-lead - just as it is with a microphone cable. The fact that 99.9% of phono stages have two RCA receptacles ensures that a similar proportion of head-leads will be configured this way.
There is no electrical reason that a phono cartridge cannot be configured as a balanced transducer, provided that one can obtain (or fabricate) a head-lead using two TSP (twisted, shielded pair) cables terminated by a suitable 3-pin connector to a suitable transformer-coupled or virtual-earth instrumentation amplifier. The fact that this configuration is comparatively uncommon does not make it impractical, and it is potentially much quieter due to common-mode rejection. Effects of cable loading, capacitance and 'tone', however, could open another whole can of worms and doubtless spawn all manner of bumf in the blogosphere!
Where a lot of blogs go off the rails is in the assumption that the shield of the RCA cable is:
A. functionally equivalent to the 'cold' signal side of a TSP cable.
B. not commoned with the other channel and the arm-tube ground at either or both ends.
When dealing with low-level balanced signals and high-gain stages, TSP is the only sensible choice.
A few esoteric phono stages do support this type of operation - the Thrax Orpheus is one of them."
Wally Malewicz from WAM Engineering says:
"Are cartridges “balanced” devices? Yes, in the sense that they are not referenced to ground, though depending upon the tone arm wiring arrangement yours may or may not be referenced to ground. The cartridge itself is “balanced”: that is, the red and white pins send the positive phase of the right and left channels’ signals and the green and blue ones the negative phase for each channel.
If your tone arm wire preserves this arrangement, the RCA plug’s pin carries the positive half and its ring carries the negative. A separate ground wire (or a pair) allows the option of tying the negative phase to ground or not.
When the tone arm wire terminates with RCA plugs and the phono preamp jacks are RCA, the negative connects to ground in the phono preamplifier. The voltage on the ground side is an unvarying 0. The voltage on the plus side is the tiny cartridge voltage to be amplified and of course it varies with the musical signal. The phono preamplifier amplifies the difference between the varying voltage and ground."
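The practical payoff of the balanced hookup both quotes describe - common-mode rejection - can be illustrated with a toy numerical model (a sketch only, with arbitrary signal and hum levels; real-world rejection is limited by how well the two legs are matched). Hum induced equally on both conductors of a twisted pair is "common-mode"; a differential input amplifies hot minus cold, so the hum cancels while the cartridge signal doubles:

```python
import math

fs = 48000   # sample rate (Hz)
n = 96       # 2 ms of samples

# Hypothetical 5 mV cartridge signal at 1 kHz (hot = +v, cold = -v)
# with 10 mV of 50 Hz hum induced identically on both conductors.
signal = [0.005 * math.sin(2 * math.pi * 1000 * t / fs) for t in range(n)]
hum = [0.010 * math.sin(2 * math.pi * 50 * t / fs) for t in range(n)]

hot = [+s + h for s, h in zip(signal, hum)]
cold = [-s + h for s, h in zip(signal, hum)]

# Unbalanced input: the cold leg is grounded, so only "hot" is amplified
# and the hum rides along with the signal.
unbalanced = hot
print(max(abs(u - s) for u, s in zip(unbalanced, signal)) > 0.004)  # True: hum present

# Balanced (differential) input: amplifies hot - cold. The common-mode hum
# cancels and the wanted signal doubles.
balanced = [h - c for h, c in zip(hot, cold)]
hum_error = max(abs(b - 2 * s) for b, s in zip(balanced, signal))
print(hum_error < 1e-12)  # True: hum fully rejected in this ideal model
```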
We'll let you ponder on this a little (yep you will lie awake trying to figure this out).
We'll add some interesting comments on this soon.
I think it's akin to some of those crazy critters found in nature - the ones that decide under a full moon who they want to be in order to make babies.
So some say it is a single-ended device. Others insist it is XLR-balanced and claim that the XLR connector makes all the difference.
Hmmmmm - got me thinkin!!!
© 2019 Telos Audio Distributors | Ph +61 419 830 587 | Contact Us