According to Incapsula.com's 2015 Bot Traffic Report, 48.5% of all web traffic is fake.
And if that's the case, what do you think this could mean for the accuracy of your tracking, your stats, your split tests, and ultimately your profits?
Just let that sink in for a minute ...
Now obviously this is just an average and not every source of traffic to your site or your tracking links is going to be 48.5% fake, but it's almost always more than you'd think.
Honestly, for "small" sites like ClickMagick the percentage of bot traffic is usually even higher because bots don't generally favor big and popular sites the way people do.
That means small sites get most of the same bot traffic that big sites get. They just get fewer real human visitors, so the overall percentage of bot traffic is even higher.
I always wondered why people in our industry don't make a bigger deal about this ...
And then after launching ClickMagick and talking to many of our customers it became immediately obvious that most people just have no clue how pervasive bots are.
For example, did you know that any time you share a link on Facebook or Skype it will be immediately "clicked" by several bots?
In this case they're Facebook and Microsoft bots that are just checking to make sure you're not sharing links to banned content and that type of thing ...
... but this is just one simple example, and the point is that there are literally millions of bots "behind the scenes" doing all sorts of things you can't even imagine.
The whole bot problem is actually even worse than it sounds on the surface.
Because not only do bots artificially inflate your click counts, but they can also generate fake opt-ins and, if you're really unlucky, even fake sales, believe it or not.
Bots wreak havoc when it comes to tracking.
Off the top of my head I'd say that around 15% of the support tickets we get at ClickMagick are due to confusion caused by bots.
For example just the other day we received a support ticket from someone who swore ClickMagick was "broken" because one of their new tracking links showed 12 filtered clicks but they hadn't even started promoting the link yet.
Turns out they had given the link to a handful of different people on Skype and FB.
But the idea that bots had generated the "clicks" was so foreign to them that they still had a hard time believing it even after we explained it, and even proved it to them.
So what exactly are bots anyway, and how can you fight back? Let's talk about this ...
A bot is any type of automated system or software that accesses your links.
They can be innocent, like a search engine spider or a link monitoring system, or malicious, like a content scraper or a script used to generate fake clicks - but either way they seriously screw up your stats.
The thing is that most tracking systems can't identify these bot "clicks", so they treat them just like a real click from a real person, and that's a real problem.
Really think about what this means for the accuracy of your split tests for example.
You could be leaving a lot of money on the table due to skewed stats caused by bots, not to mention all the bad decisions you may have made based on that bad data.
Because bots do some crazy things - for example requesting the same page 50 or 100 times in a minute or two - and this type of thing can literally render your split test results almost completely meaningless.
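To make that concrete, here's a quick back-of-the-envelope sketch of how bot clicks distort a split test. The numbers here are made up purely for illustration:

```python
# Hypothetical split test: both pages convert real visitors at the same 10% rate,
# but a bot hammers page B with 100 fake "clicks" that never convert.
real_clicks_a, sales_a = 200, 20                     # page A: humans only
real_clicks_b, bot_clicks_b, sales_b = 200, 100, 20  # page B: humans + bot noise

rate_a = sales_a / real_clicks_a                   # 10.0% - accurate
rate_b = sales_b / (real_clicks_b + bot_clicks_b)  # ~6.7% - looks much worse

print(f"Page A: {rate_a:.1%}  Page B: {rate_b:.1%}")
# The two pages perform identically with real visitors, but the skewed
# stats would push you to "kill" page B and declare A the winner.
```

Now imagine that same bot burst hitting only one side of a test you're actually running real money through.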
Even the "best" and most popular tracking and analytics systems like Google Analytics do a really poor job dealing with this problem.
Google Analytics has a setting you can turn on to filter clicks from "known bots and spiders", but the source of its data is the IAB/ABC International Spiders & Bots List, and let's just say that list covers only a small percentage of what's really out there.
And yeah, if you're running PPC ads on Google, Bing or Facebook I can assure you that you're paying for fake clicks.
I'd bet my life on it, and this is actually one of the reasons I created ClickMagick.
Obviously I can't promise anything specific but, armed with the data that ClickMagick provides, our users are constantly getting refunds from ad networks for bad clicks.
See, at the very core of ClickMagick is our traffic filtering and "cleaning" technology that carefully analyzes each and every click that flows through the system.
A few other trackers claim to filter out "known bots", but what good does that do if their system only knows about 5 or 10% of the bots out there?
With other systems bot filtering is just an add-on. An afterthought.
But ClickMagick was built from the ground up to do a better job of filtering bots and other bad traffic than anyone else, and that's exactly what it does.
ClickMagick has over a dozen different monitoring systems that constantly monitor and analyze not only every detail of every click, but actual click and session "behavior", conversions, and lots more we just can't talk about publicly.
This allows our system to do something that most others can't, and that's to identify all of the bots that don't properly identify themselves like good bots do.
See, bad bots try to fly under the radar by disguising themselves as legitimate web browsers instead. And that's why most other trackers only filter out a small portion of bot clicks - because they only know about the good bots that identify themselves.
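To illustrate the gap, here's a minimal sketch of the "known bots" approach most trackers rely on - a simple User-Agent match against a list. The signature list and log entries below are made up for illustration:

```python
# A tiny "known bots" list - the kind of filtering most trackers rely on.
KNOWN_BOT_SIGNATURES = ["Googlebot", "bingbot", "facebookexternalhit", "Slackbot"]

def is_known_bot(user_agent: str) -> bool:
    """Flags only bots that politely identify themselves in the User-Agent."""
    return any(sig.lower() in user_agent.lower() for sig in KNOWN_BOT_SIGNATURES)

clicks = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)",
    # A malicious bot disguised as an ordinary Chrome browser:
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 Chrome/120.0",
]

for ua in clicks:
    print(is_known_bot(ua), "-", ua[:45])
```

The first two clicks get filtered, but the third - the disguised bad bot - sails right through and gets counted as a real visitor, which is exactly the problem described above.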
But once ClickMagick identifies a potential new bot, it's tracked and monitored for a period of time to determine with 99.99% accuracy whether it really is a bot or not ...
... and then if it is ClickMagick can begin to filter this new source of bad clicks, and all of our users' traffic gets just a little bit cleaner, and their stats get even more accurate.
Between our automated monitoring systems, lots of human oversight on our end, and input from our users, ClickMagick's filtering system actually gets smarter and more accurate each and every day.
There's nothing else like our traffic filtering system, and I'm quite proud of it.
You might think this sounds like overkill or it's unnecessary, but really ...
... what's the point of tracking anything if your stats aren't accurate?
This is such a no-brainer that I'll actually make you a promise right now:
If you want to give ClickMagick a try, I guarantee it'll pay for itself just in the fake clicks and click fraud that it detects for you automatically.
If you don't find that to be true just let me know and I'll personally send you back twice the amount of money you paid for your ClickMagick account.
It hasn't happened yet but that's how confident I am in ClickMagick's ability to clean your traffic and provide you with the most accurate stats possible.
If you don't use ClickMagick, I really don't have many other recommendations, simply because there isn't much else I can recommend that you do.
If your tracker filters "known bots and spiders" that's a start and better than nothing.
Other than that, about the only thing you can do is manually comb through your raw click logs and try to identify suspicious clicks and IP addresses, and then manually investigate each and every one to look for patterns and things like that ...
And that was painful just to think about and write.
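If you do want to attempt it, here's a minimal sketch of that manual log-combing process: group raw clicks by IP and flag any IP that bursts. The log format and the "20 clicks in 60 seconds" threshold are just illustrative assumptions, not anyone's official method:

```python
from collections import defaultdict

def flag_suspicious_ips(clicks, max_clicks=20, window_secs=60):
    """clicks: list of (unix_timestamp, ip) tuples from a raw click log.
    Returns the set of IPs that made more than max_clicks requests
    inside any window_secs-long window."""
    by_ip = defaultdict(list)
    for ts, ip in clicks:
        by_ip[ip].append(ts)

    suspicious = set()
    for ip, times in by_ip.items():
        times.sort()
        # Sliding window: compare each click to the one max_clicks later.
        for i in range(len(times) - max_clicks):
            if times[i + max_clicks] - times[i] <= window_secs:
                suspicious.add(ip)
                break
    return suspicious

# Example: one IP hammering the same link 50 times in under a minute,
# plus a normal visitor clicking once every five minutes.
log = [(1000 + i, "203.0.113.7") for i in range(50)]
log += [(1000 + i * 300, "198.51.100.2") for i in range(5)]
print(flag_suspicious_ips(log))  # {'203.0.113.7'}
```

And that's just one pattern - you'd still have to repeat this kind of analysis for user agents, referrers, click timing, and everything else, which is why doing it by hand doesn't scale.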
Either way, at least now you're aware of the problem ...