Ad Rate = 34(best demo) + 83(demo * age factor)
- Best demo = highest rated out of A18-49, M18-34, W18-34
- Age factor = #A18-34/#A18-49
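To make the base formula concrete, here is a minimal Python sketch. The sample ratings and population counts are hypothetical, plugged in only to show how the pieces fit together (reading "demo" as the average demo rating, as the refined formula below makes explicit).

```python
def base_ad_rate(best_demo, avg_demo, a18_34, a18_49):
    """Omabin's base formula: 34(best demo) + 83(demo * age factor),
    where the age factor is the ratio of A18-34 to A18-49 viewers."""
    age_factor = a18_34 / a18_49
    return 34 * best_demo + 83 * avg_demo * age_factor

# Hypothetical example: a show with a 2.0 best demo and a 1.5 average
# demo, using the fall 2013 TV-household counts (in people).
rate = base_ad_rate(2.0, 1.5, 67_630_000, 126_960_000)
print(f"${rate * 1000:,.0f}")  # the formula's output is in thousands of dollars
```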
The constants were worked out by Omabin, and I don't have an explanation for how they were found.
Using this formula as a starting point, I derived my own, more convoluted formula. It has more constants, and I kept every decimal place rather than rounding off--meaning that my formula is NOT rounded. Here it is:
Ad Rate = 34(best demo) + 83(average demo)((.6612808381*A18-34)/(1.2696112*A18-49))(.986)
How did I get these seemingly arbitrary constants? Well, there are 126,960,000 A18-49 who live in a household with a television as of fall 2013 (a little outdated, but it's the best I’ve found so far), and there are 136,400,000 A18-49 who live in the United States. This means 93.08% of Americans in the key A18-49 demographic live in a home with at least one TV. Advertisers can’t possibly be marketing to people who don’t have a TV, so I think it is safe to take the #A18-49 from Omabin’s formula and multiply it by .9308 (or .931, or .93, depending on how precise we want to get). The result will probably not be much different, but it may reduce the percent error.
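The .9308 figure is straightforward to verify from the two population counts above:

```python
# TV-household A18-49 (fall 2013 Nielsen) vs. total US A18-49 population
tv_households_a18_49 = 126_960_000
us_population_a18_49 = 136_400_000

tv_penetration = tv_households_a18_49 / us_population_a18_49
print(round(tv_penetration, 4))  # 0.9308, i.e. 93.08%
```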
Also as of that 2013 Nielsen study, there were 67,630,000 A18-34 who live in a household with at least one television, which is 90.17% of the A18-34 population. So we can multiply the #A18-34 in Omabin’s formula by that number (.9017, .902, or .9 depending on how exact we want to get). Based on this data, the constants were calculated using a little magic called arithmetic. The final constant, .986, is based on the percentage of adults between the ages of 18 and 49 who are reported to own at least one television. I think a great example of what I was trying to capture came through Chicago PD. In order to predict the ad rates for 2014, I needed to use data from the 2013-2014 season so that I could compare it with the actual fall 2014 figures. Here is what I computed:
34(2.3) + 83(1.58)((.6612808381*1.09)/(1.2696112*1.6))(.986) = 124.08
This means the expected ad rate based on my formula was $124,080. This is off just slightly from the real ad rate, $121,400.
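The Chicago PD numbers can be checked directly in a few lines of Python (the 2.3 best demo, 1.58 average demo, 1.09 A18-34 rating, and 1.6 A18-49 rating are the 2013-14 season figures used above):

```python
# Chicago PD, 2013-14 season figures
best_demo, avg_demo = 2.3, 1.58
a18_34_rating, a18_49_rating = 1.09, 1.6

rate = 34 * best_demo + 83 * avg_demo * (
    (0.6612808381 * a18_34_rating) / (1.2696112 * a18_49_rating)
) * 0.986
print(f"predicted: ${rate * 1000:,.0f}")  # about $124,081 vs. the actual $121,400
```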

Let me give you another example--Nashville. Through my formula, I first predicted that its ad rate would be $137,607.2379692 (yes, I kept it that exact). However, something stood out to me: the drop in raw ratings between Nashville's first and second seasons was 21%. Surely ABC couldn't have expected the show to hold steady after such a drop, right? So, if I multiply the result by .79, the calculated ad rate becomes $108,710, very close to the actual $104,511. Once again, I still can't explain the statistical significance of this, if there even is one. These results may simply be a fluke, but it should be noted that when factoring in an expected year-to-year increase or decline, the results are pretty similar.
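The Nashville adjustment is just the raw prediction scaled by the season-over-season ratings drop:

```python
raw_prediction = 137607.2379692  # the formula's output, kept unrounded
season_drop = 0.21               # Nashville's raw ratings fell 21% year over year

adjusted = raw_prediction * (1 - season_drop)
print(f"${adjusted:,.0f}")  # $108,710, vs. the actual $104,511
```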
HOWEVER, this formula is far from perfect. It has trouble predicting shows like Family Guy, which this past season had a season high of a 4.6 A18-49 rating and a low of a 1.2. It also has trouble predicting CW shows. I also noticed that ABC's ad rates tracked strikingly closely with each show's rating relative to the scripted average, with an average show around $100,000--but that is just a trend I noticed that would need much further study in order to draw a real conclusion.
Anyways, that's just a very quick glimpse of what I have been working on. If you want more, leave a comment with your inquiry or email me at theratingsjunkie@gmail.com.