Dude… Would You Please Quit It With The Financial Markets Jargon Already?

“Data-Driven Thinking” is a column written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by David Soloff, CEO, Metamarkets, a publisher analytics company.

During the summer of 2008, I was having one of those NY business breakfasts – the kind of meal you can’t get anywhere else on the planet: a $38 plate of poached eggs and a French press. I was meeting an old friend, a very smart hedge fund alum. I had this notion about capturing terabytes of media transaction data, and I wanted his opinion. The thinking went: media markets are utterly irrational, sellers don’t know up from down, velocity is picking up, fragmentation is in full blossom. And all the while, everyone is standing around shrugging their shoulders as to why the ‘big money’ is not moving off the sidelines and headlong into this new electronically traded media market. But I had it all figured out. It was clear as day to me: there was no market signal, no data. Every transaction formed its own market. And that was going to severely limit the development of market liquidity and keep dollars from flowing to these marketplaces.

My guy was a veteran of a couple of the most profitable and aggressive derivative desks in the world. He traded at the top, traded everything, every instrument, every market, every city. This guy would structure a trade around whether your car started in the morning, and make money regardless of outcome. Until one day my guy recognized that there was no Tums big enough to keep him in the game – basically, he was gonna explode. So he retired at 34. This guy is a monster, a trading animal in a very expensive suit and tie, and very little gets past him.

It was my mistake to pitch him something so hare-brained. Guys like my guy are merciless when they sniff something that doesn’t make sense. My insane notion – that somehow, some way, terabytes and petabytes and exabytes of media transaction data could be provisioned across market channels for the benefit of market actors and the general enhancement of market liquidity – did not make sense to him. It was just too bloody expensive to capture this stuff, and to store it, he said. But I’m getting ahead of myself.

For the first part of the breakfast I had been talking about a positive vision of information markets, about data singularity, about data liquidity and financial instrumentation… And then I walk right into my friend’s buzzsaw. At this point I should say that I talk a lot. As a matter of fact, when the topic is something I know something about, I pretty much can’t shut up. So here I am going on and on about how the ads markets, the media markets, were ripe for standardization and normalization via the introduction of information products and market data services. Hadn’t that been the story of the financial markets? Isn’t that how market velocity and liquidity had emerged? Through standardized trade capture, screen based trading, market data services? Wasn’t it true that once these information services and screen-based trading products were at long last introduced to the market in the 1970s and 80s, there had been an explosion in transaction liquidity? Wasn’t this precisely because there was at last price discovery, a facility in risk management, you had derivatives, you had computers driving trading? There was a massive surge in the dollar volume traded… It was all so good. So capitalist. So electronic.

So here we were, my $58 poached eggs congealing as I talk and talk… I am coming to my point. I swear. I had this notion I wanted to bounce off him – somehow this idea for a market data service and data liquidity and risk management could be extended and expanded beyond the more traditional markets of financial instrumentation. Wasn’t it at last time to take this same approach beyond financial markets? Was he maybe interested in talking more about this with me? It was August 2008 and the S&P was breaking down a bit, sure, but we were still golden – Bear Stearns was a penny stock subsidiary of JP Morgan, but all in all that was a small price to pay. Wasn’t it time to talk about risk management for other markets, hedge-ability for all kinds of electronically traded asset classes? Sure there were cracks showing in the financial markets, but there was always definitely capital for this kind of thing, wasn’t there? There was always going to be an appetite for businesses dealing in petabytes of PRIMARY ECONOMIC SIGNAL, MAN….

I think at this point in the meal I was probably standing up, launching into an impromptu whiteboard session. I was passionate about this idea at the time. There could be data and information flowing from one terminal to the other. There could be indexing, real-time valuation, market dashboarding. “Bro, we are gonna have a single electronic media market – there will be a central clearing registry, there will be market data services, there will be traders and brokers arrayed on both sides of that marketplace. We are coming nearer to that day when the buyer and seller can direct connect by virtue of clearing and information services. We are going to wipe away these false distinctions between a spot market or an upfront market – data is going to liberate the sellers. They are going to that promised land: optionality. They are going to have the data and routing sophistication they need to maintain optionality. And it’s going to juice their margins. They are going to be like a prop desk that is sitting on a big position and that knows where all the buyers are and what they are willing to pay. And it’s all going to be fueled by my information products.”

When he got up to go to the men’s room, I swear he was smirking. I ate my cold $148 eggs. When my friend returned to the table, he dropped the bomb. “It’ll never work.” He meant it too.

“Nobody can afford to capture this data. Even if you somehow convince these guys to let you sit on their stream, even if it is actually signal, the price point it’s reflecting is infinitely small. Nobody can build the infrastructure they need to capture, process and store this stuff, not data that fine-grained. No way. You’ll need millions of dollars per month just to store and process the stuff. Your db license is going to cripple you. Guys on the Street can do this because they are trading hundreds of millions of dollars at a clip. You can’t do this when it’s thousands of dollars per trade. It won’t work. It’s interesting but it won’t work.” Then, the kick in the teeth. “One other thing. I’m a trader, and I’m your friend. So you should listen to me. Dude. Would you please quit it with the financial markets jargon already? Financial markets work because of the capital that trades through them, and because of the capital that is spent to operate and maintain them. Your market will never be able to afford this infrastructure.”

He was right, at least that morning in August 2008 when it cost about $1250 to store a TB of data. Nobody had heard of AWS S3. Greenplum licenses were running $500,000 to start. Few were pipelining in Pig, or running front-end viz off open-source JavaScript libraries… Apache was just starting to codify and open-source HBase, based on the Google Bigtable paper. I didn’t know it as I rode the subway to my morning meeting, but it was the dawn of a new day… The markets crashed about 3 weeks later, and hedge fund kids started starting startups.

The past couple of years have brought into focus that we are experiencing an “Attack of the Exponentials”: accelerating collapse in storage cost; accelerating collapse in CPU cost; collapse in data transfer cost; surge in network nodes.  I think all that financial markets jargon is finally becoming relevant. Especially when storage has collapsed to $70 per month per TB, down from $1250 in just under 30 months.
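The storage numbers cited above imply a strikingly fast decay rate. A minimal sketch, taking the article's two data points ($1,250/TB falling to $70/TB in roughly 30 months) as the only inputs, works out the implied halving time:

```python
import math

# Figures cited in the article: ~$1,250/TB around August 2008,
# ~$70/TB roughly 30 months later.
start_cost, end_cost, months = 1250.0, 70.0, 30.0

# Number of times the cost halved over the period.
halvings = math.log2(start_cost / end_cost)

# Implied halving time: cost halves roughly every 7 months.
halving_months = months / halvings

# Fraction of the cost remaining after one year of this decline.
annual_factor = (end_cost / start_cost) ** (12 / months)

print(f"cost halves every {halving_months:.1f} months")
print(f"~{(1 - annual_factor) * 100:.0f}% cheaper each year")
```

By this estimate, storage was getting roughly two-thirds cheaper every year during that stretch, which is the "accelerating collapse" the column is pointing at.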

“Take that. Dude.”

Follow David Soloff (@davidsoloff), Metamarkets (@metamx) and AdExchanger.com (@adexchanger) on Twitter.



  1. When you bury the lead this deep in the article, you get the reader filling in the blanks long before you get to the point, and in my opinion, that can be a good thing.

    Your punch line was the exponential decrease in storage, CPU and data center costs, and a surge in network nodes.

    With my background I was just thinking about the “network nodes” from the beginning, and the nodes I was thinking of are the hundreds of millions (soon to be over a billion) human beings who are putting their thoughts, ideas and actions into searchable form each month… i.e., social media.

    Several months back the New York Times ran a piece on Wall Street and other trading companies using a variety of public data to inform their computer-run, high-speed trades. You can find that article here:


    I know the NYT was just catching up to what trading programs have been doing for years, but it was informative to me to see they were using sentiment analysis of tweets to inform their programmed trades.

    I have a history in information arbitrage across very small cash-based markets, so I understand the power of information to inform lucrative positions. And more importantly, I understand that being able to mine a large database of past and current data can inform trading actions in a meaningful way.

    My current interest finds me doing agency work for large companies that are interested in social media. And I use this sentiment trading example to explain the importance of social media monitoring, sentiment analysis and the amazing growth that we will see in this area over the coming years.

    I don’t know if you allow links, but here was my take on trading on sentiment analysis:


    Roger Ehrenberg pointed me to that first NYT article (via Twitter), so it is not surprising that I found this article from him as well.

    Always excited to read thoughts and developments in this area. David, thanks for the article.

  2. Doug Conely

    I have been making the point about the industry being too small to support the infrastructure costs of all these platform approaches for a while. As you say, though, that’s probably no longer true.

    However, there are three challenges remaining beyond this to the financial analogy.

    Firstly, I still believe there are too many venture-backed companies building tech in this space relative to the size of the opportunity. To make money with exchange/tech platform economics you need to drive very high volumes. Not everyone can be a winner, so many of those investments will be wasted while driving down pricing for those that can survive. Financial markets were far less fragmented in the ’70s and ’80s than media is today.

    Secondly, there is still a major challenge with system integration. You either have the choice of building out a platform yourself, like we have at Tribal Fusion, or getting very, very good at integrating best-of-breed solutions. I still haven’t seen anyone particularly impressive on that point – happy to hear otherwise though.

    Thirdly, I just don’t believe that there’s enough money in the industry to hire and pay the kind of smart people needed to build these systems, analyze results and make fast decisions. Are there any network or media trading desk analysts making a million dollars a year? No.

    I love the science and technology at work here – and it’s going to be a part of the industry at the margins – but media is still going to be a people business for a long time to come.

  3. Nice piece David – and great insights. I had a very similar conversation almost two years ago with a friend who was a quant in prop trading on the Street, and he showed a lot of interest that was somewhat dulled by the small amount of money at stake. The issue here is that the “impression” is an exceedingly tiny transaction, but what it is really getting at is the ultimate thing we are all fighting for in the media business (whether we use tech to get there, or creative, or something else), which is attention.

    Attention is the ultimate human currency: and around it everything flows. Is all of it addressable? Not really. Advertisers do their best to insert messages in the world that can catch the eye or ear of enough human beings around to make a difference but most of our waking hours we are thankfully not exposed to advertising. (BTW for you geeks, a computer comparable to the human brain would need to be able to perform more than 38 thousand trillion operations per second and hold about 3,584 terabytes of memory according to Dharmendra Modha of IBM)

    These little time slices, when treated as completely disaggregated, aren’t very valuable. They become much more valuable when they are understood relative to one another for a single individual. Even assuming that you only get to advertise in the places you do today, I believe that 60-75% of all advertising could be eliminated if there were better coordination between all the different media opportunities relative to the user. The average online user in the United States is worth just 36c a day in ad revenues today ($6.5 billion / quarter / 90 days / 200mm users). If you assume they see about 120 ad impressions or so per day on average (which is my own number but probably ballpark) then we’re talking about a $3.01 CPM on average (this is search and display combined).

    So my question is, (online) instead of all this complicated stuff we are all working on, why not just figure out one or two messages you want someone to focus on in a single day, and get rid of all the other ad clutter junk we’re trying to endlessly analyze in minute time slices?
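    The per-user arithmetic in comment 3 above checks out. A minimal sketch, using only the figures the commenter supplies ($6.5B quarterly revenue, 200 million users, ~120 impressions per user per day – the last being the commenter's own estimate):

```python
# Figures taken from the comment above; the impressions-per-day
# number is the commenter's own ballpark estimate.
quarterly_revenue = 6.5e9              # US online ad revenue per quarter
days_per_quarter = 90
users = 200e6                          # US online users
impressions_per_user_per_day = 120     # commenter's estimate

# Revenue per user per day: ~$0.36, i.e. the "36c a day" in the comment.
revenue_per_user_per_day = quarterly_revenue / days_per_quarter / users

# Blended CPM (cost per thousand impressions): ~$3.01.
cpm = revenue_per_user_per_day / impressions_per_user_per_day * 1000

print(f"${revenue_per_user_per_day:.2f} per user per day")
print(f"${cpm:.2f} blended CPM")
```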