Automatically exclude robots and 0-second visitors
Even better: put them on a separate page.
Bot filtering was added in version 3.4.4: all crawlers detected as bots via the UserAgent string are now automatically excluded from tracking.
There might still be 0-second sessions (users who leave before the page fully loads) or some custom bots/crawlers that don't set a bot-like UserAgent. You can still filter those sessions using the Session Length filter in a segment.
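As a rough illustration of the UserAgent-based approach described above, here is a minimal sketch of crawler detection. The pattern list and function name are assumptions for illustration only, not userTrack's actual implementation:

```python
import re

# Hypothetical keyword list; real bot filters use much larger,
# regularly updated pattern databases.
BOT_PATTERN = re.compile(
    r"bot|crawl|spider|slurp|curl|wget|headless", re.IGNORECASE
)

def is_bot(user_agent: str) -> bool:
    """Return True if the UserAgent string looks like a known crawler."""
    return bool(user_agent) and bool(BOT_PATTERN.search(user_agent))
```

Note that this only catches bots that identify themselves; a crawler sending a browser-like UserAgent would pass through, which is why a session-length filter is still useful as a second line of defense.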
I'm testing with three analytics systems (userTrack, Google Analytics, and another session-recording and analytics system).
I think userTrack is better than the other analytics system, but the other system shows more accurate visitor statistics (similar to Google Analytics) because it doesn't include 0-second sessions, crawlers, or other robots.
I can't post images here (if you want, I can email you screenshots of the graphs to show the difference).
I have tested two websites: one with an average of 200-300 visitors/day and another with 0-3 visitors/day.
On the site with 200-300 visitors per day, the other analytics system shows slightly more visits per day (normally 5-20 more). On some days, userTrack shows 100 more visitors than Google Analytics and the other analytics system.
On the site with 0-3 visitors per day, the differences are even bigger. While the other two systems show 0-4 visitors, with only small differences between them, on some days userTrack shows 40-50 extra visitors. My websites are in Spanish, yet these visitors come from the USA with 0-second sessions or, sometimes, sessions of one or two minutes but with no movement (their session replays end instantly).
I repeat that I consider your software the best for session recording, heatmaps, and analytics in general. I love it and use it across my whole private network. But I believe its only problem is the inclusion of robots in the general stats. Filtering them out so they don't appear in the stats would be amazing. Or, if you prefer, show separate stats for robots. Either way, I think it's enough to filter out robots and 0-second visitors and not count them.
I love that the main page shows the stats for all my websites at the same time, but these extra fake visitors distort the numbers and confuse me.
Thanks for your awesome software and support!
David Levi commented
Was this ever addressed? I would very much like to see this.
While the data might be useful, it clogs up a lot of the more relevant data.
I don't think that completely excluding bot/fake visitors is a good idea. Sometimes they might not even be bots, but visitors that didn't wait for the page to completely load before leaving.
Now, having them on a separate tab/page might be useful. I think a better way might be to simply add a filter to the clients list to hide/show visitors with 0 data.
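The hide/show toggle suggested above could work roughly like the following sketch. The session fields and function name are assumptions for illustration, not userTrack's actual schema:

```python
# Hypothetical session records with an assumed duration field.
sessions = [
    {"ip": "203.0.113.4", "duration_seconds": 0},
    {"ip": "203.0.113.8", "duration_seconds": 95},
    {"ip": "203.0.113.9", "duration_seconds": 0},
]

def filter_sessions(sessions, show_zero_data=False):
    """Return all sessions, or only those with a non-zero duration."""
    if show_zero_data:
        return sessions
    return [s for s in sessions if s["duration_seconds"] > 0]
```

The advantage of a toggle over hard exclusion is that the 0-second sessions remain stored and inspectable; only their visibility in the client list changes.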
What do you think?