
Monday, May 28, 2007

Founded in 1995, the company offers a vast online and print network to help job seekers connect with employers. In May 2006, comScore Media Metrix ranked it as the nation's largest online job site, with more than 23 million unique visitors and over 1.5 million jobs. It also powers the career centers of more than 900 partners that reach national, local, industry, and niche audiences.

In January 2006, the company launched a site designed specifically for college students. To drive traffic to the new site, it initiated outreach programs on college campuses across the country, running print ads in college publications and sponsoring a variety of campus events. Given the wide range of events and the considerable investment in on-campus outreach, the team was committed to tracking how successfully each kind of event drove traffic to the site. It was clear that an innovative approach to online tracking would be needed, one that could link offline events to online activity. To this end, the team created a pilot marketing tracking study using Google Analytics.


During the period of the study, the company sponsored career fairs, athletic and social events, and guest speaker sessions to give students an opportunity to learn about career opportunities. "All of these activities take time, and we need to figure out how best to focus our efforts," explains Nathan Lippe, collegiate marketing manager. "While the team is on the road, we want them to do whatever is most effective."

Using the Google Analytics Network Location report, Lippe was able to pinpoint the campuses from which the site was getting its traffic. The team used this information to run a study comparing traffic before and after events.

In the analysis, Lippe set out to determine which type of campus activity yielded the highest percentage increase in traffic. The team ran eight different activities across more than 50 campuses. For each activity, Lippe compared the total number of visits for the 30 days prior to the activity with the total for the 30 days after it. Two of the activities produced a 30 percent increase in traffic, while another produced a 20 percent increase. For competitive reasons, Lippe does not reveal the details of specific activities, "but one of them was completely off the chart." To learn why, he decided to dig deeper into the campaign's data.
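The before/after comparison described above is simple arithmetic on two 30-day visit totals per activity. The sketch below illustrates it; the activity names and visit counts are hypothetical (chosen so the lifts match the 30 and 20 percent figures in the study), and in practice the totals would come from Google Analytics reports.

```python
# Hypothetical 30-day visit totals before and after each campus activity.
# Real figures would be pulled from Google Analytics for each window.
activities = {
    "activity_a": {"visits_before": 12000, "visits_after": 15600},
    "activity_b": {"visits_before": 9800, "visits_after": 11760},
}

def percent_lift(before, after):
    """Percentage change in visits from the pre-activity to post-activity window."""
    return (after - before) / before * 100

for name, counts in activities.items():
    lift = percent_lift(counts["visits_before"], counts["visits_after"])
    print(f"{name}: {lift:+.0f}% change in 30-day visits")
```

Running this prints a +30% change for the first hypothetical activity and +20% for the second, mirroring the comparison method the study used.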

"A well-designed methodology in combination with Google Analytics made our testing a success. Armed with what we've learned, I feel confident that we've developed the right marketing focus."

-Nathan Lippe
Collegiate Marketing Manager

For the high-performing campaign, two print ads had been created to promote an on-campus event. The first ad ran two weeks prior to the event and the second ran one week prior to the event. Lippe looked at traffic over five distinct date ranges: the "baseline" period (30 days prior to running the first ad), the "promotion response" period (seven days after running the first ad), the second "promotion response" period (seven days after running the second ad), an "event response" period (seven days after the main event), and finally the "event lift" period (30 days beginning one week after the main event). Google Analytics was used to determine the average number of daily visits over each of these date ranges for three campuses.

Lippe saw that traffic increased by over 1,000 percent from the "baseline" period to its peak during the promotion and event. "We had expected to see a spike in traffic," Lippe said, "but what really surprised us was that, for over 30 days after the event, there was a sustained lift in traffic. And we saw the same pattern across all three campuses." Indeed, during the "event lift" period, the 30-day window beginning one week after the main event, the site saw an average daily traffic lift of 230 percent over the baseline period. It was clear that the promotions had been effective at increasing long-term awareness and usage of the site.
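The five date ranges and the lift-over-baseline calculation can be sketched as follows. The visit totals are hypothetical, chosen so the "event lift" window works out to the 230 percent lift reported in the study and the peak to roughly the 1,000 percent spike; the method (average daily visits per window, compared against the baseline average) is what the study describes.

```python
# Hypothetical total visits and window lengths (days) for one campus,
# following the study's five date ranges.
windows = {
    "baseline": (3000, 30),             # 30 days before the first ad
    "promotion_response_1": (2100, 7),  # 7 days after the first ad
    "promotion_response_2": (2800, 7),  # 7 days after the second ad
    "event_response": (7700, 7),        # 7 days after the main event
    "event_lift": (9900, 30),           # 30 days starting one week after the event
}

def avg_daily_visits(total_visits, num_days):
    """Average visits per day over a reporting window."""
    return total_visits / num_days

baseline_avg = avg_daily_visits(*windows["baseline"])

for name, (total, days) in windows.items():
    avg = avg_daily_visits(total, days)
    lift = (avg - baseline_avg) / baseline_avg * 100
    print(f"{name}: {avg:.0f} visits/day ({lift:+.0f}% vs baseline)")
```

With these illustrative numbers, the event-response window peaks at a 1,000 percent lift and the event-lift window settles at a sustained 230 percent lift over baseline.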

As Lippe further studied traffic for each of the date ranges, he noticed something unusual. Traffic spikes also occurred during the "promotion response" periods, not just the "event response" period. This told Lippe that something about the promotion was as important as the event itself. "We found that it was the incentive to go to the site, rather than the actual event, that really made a difference. With this knowledge we plan to run more incentive-based promotions since we know they work well," he said. "You can gain so much insight if you go beyond simply comparing traffic before and after the events."


This fall, the company continues its on-campus marketing activities. "We're planning to visit 80 more colleges this fall, and we want to get the most value from the events we sponsor," Lippe said. "The results we've seen with certain activities have become our new baseline. We're not just driving one-time visitors to the site; we're getting students to learn about the site and use it as a resource. A well-designed methodology in combination with Google Analytics made our testing a success. Armed with what we've learned, I feel confident that we've developed the right marketing focus."




Copyright 2018. The Marketing Blog.