Tag Archives: linkedin

Web Log Storming: up to 40% competitive discount

In addition to our educational 30% discount, we have just announced a competitive discount of 20% or 40% (depending on the product). We believe this could be a nice opportunity to either switch to Web Log Storming or use it as an additional analytics tool. You just need to send us a screenshot as proof – a picture of the About box with your name in it (for a desktop solution) or of a web page (for a hosted solution).

For paid packages (either desktop or hosted) the discount is 40%, which means you can get a new Web Log Storming license for only $113.40 (US).

And this might surprise you: the discount is currently available even if you use a free analytics package, with one additional condition – you must have used it for at least two months (make sure you set the date range accordingly when taking the screenshot). In this case the discount is 20% and your price would be $151.20 (US).

The offer is also available for upgrades from older versions of Web Log Storming ($47.40 / $63.20).

Visit the Web Log Storming website for more information.

10 reasons why web log analyzers are better than JavaScript based analytics

In this article we are going to point out some objective strengths of web server log analysis compared to JavaScript-based statistics, such as Google Analytics. Depending on your preferences and the type of your website, you might find some or all of these arguments applicable, or none of them. In any case, everyone should at least be aware of the differences in order to make the right decision.

1. You don’t need to edit HTML code to include scripts

Depending on how your website is organized, this can be a major task, especially if it contains a lot of static HTML pages. Adding the script code to all of them will surely take time. Even if your website is based on a content management system with a centralized design template, you’ll still need to be careful not to forget adding the code to any custom pages outside this CMS.

2. Scripts take additional time to load

Regardless of what Google Analytics officials say, actual experience proves otherwise. Scripts are scripts, and they must take some time to load. If the external file is located on a third-party server (as is the case with Google Analytics), the slowdown is even more noticeable, because the visitor’s browser must resolve another domain.

As a solution, they suggest putting the inclusion code at the end of the page. Indeed, in that case the page appears to load more quickly, but the truth is that there’s a good chance the visitor will click another link before the script executes. As a result, you won’t see these hits in your stats, and they are lost forever.

3. If website exists, log files exist too

With JavaScript analytics, stats are available only for the periods when the code was included. If you forget to put the code on some pages, the opportunity is lost forever. Similarly, if you decide to start collecting stats today, you’ll never be able to see stats from yesterday or earlier. The same applies to goals: metrics are available only after you decide to track them. With some log analyzers, you can freely add more goals at any time and still analyze them against log files from the past.

4. Server log files contain hits to all files, not just pages

Using solely JavaScript-based analytics, you don’t have any information about hits to images, XML files, Flash (SWF), programs (EXE), archives (ZIP, GZ), etc. Although you might consider these hits irrelevant, most webmasters shouldn’t. Even if you don’t usually host other types of files, you must have some images on your website, and they could be linked from external websites without you knowing anything about it.

5. You can investigate and control bandwidth usage

Although you might not be aware of it, most hosting providers limit bandwidth usage and usually base their pricing on it. Bandwidth costs them and, naturally, it most probably costs you as well. You would be surprised how many domains (often from third-world countries) poll your whole website on a regular basis, possibly wasting gigabytes of your bandwidth every day. If you could identify these domains, you could easily block their traffic.
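As an illustration, the core of such a bandwidth report is just a matter of summing the bytes field of each log entry per client host. A minimal Python sketch, assuming the common Apache “combined” log format (the sample lines, hosts, and byte counts below are invented for illustration):

```python
import re
from collections import defaultdict

# Invented sample lines in Apache "combined" log format.
LOG_LINES = [
    '203.0.113.5 - - [10/Oct/2009:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 10240 "-" "Mozilla/5.0"',
    '203.0.113.5 - - [10/Oct/2009:13:55:37 +0000] "GET /logo.png HTTP/1.1" 200 20480 "-" "Mozilla/5.0"',
    '198.51.100.7 - - [10/Oct/2009:14:01:02 +0000] "GET /setup.zip HTTP/1.1" 200 512000 "-" "SomeBot/1.0"',
]

# host  ident  user  [timestamp]  "request"  status  bytes
PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) (\d+|-)')

def bytes_per_host(lines):
    """Sum the bytes-transferred field per client host."""
    totals = defaultdict(int)
    for line in lines:
        m = PATTERN.match(line)
        if m and m.group(3) != "-":  # "-" means no body was sent
            totals[m.group(1)] += int(m.group(3))
    return dict(totals)

print(bytes_per_host(LOG_LINES))
```

A real analyzer would read the log files from disk and do far more, but per-host totals like these are exactly what you would sort to find the worst bandwidth offenders.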

6. Bots (spiders) are excluded from JavaScript based analytics

Similar to the previous point, some (bogus) spiders misbehave and waste your bandwidth while giving you no benefit whatsoever. In addition, server logs contain information about visits from legitimate bots, such as Google’s or Yahoo’s. Using solely JavaScript-based analytics, you have no idea how often they come or which pages they visit.
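Because bots announce themselves in the User-Agent field of each request, a log analyzer can count their visits directly. A minimal sketch, again assuming the Apache combined format (the sample lines and the bot substring list are invented for illustration):

```python
from collections import Counter

# Invented sample lines; the User-Agent is the last quoted field.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2009:08:00:01 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2009:08:00:02 +0000] "GET /about.html HTTP/1.1" 200 4096 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '72.30.142.7 - - [10/Oct/2009:09:15:44 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp)"',
    '203.0.113.9 - - [10/Oct/2009:10:20:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows; U)"',
]

KNOWN_BOTS = ("Googlebot", "Slurp")  # substrings to look for in the User-Agent

def bot_hits(lines):
    """Count hits whose User-Agent names a known bot."""
    counts = Counter()
    for line in lines:
        user_agent = line.rsplit('"', 2)[-2]  # text inside the final pair of quotes
        for bot in KNOWN_BOTS:
            if bot in user_agent:
                counts[bot] += 1
    return counts

print(bot_hits(LOG_LINES))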

7. Log files record all traffic, even if JavaScript is disabled

A certain percentage of users choose to turn JavaScript off, and some use browsers that don’t support it at all. These visits can’t be seen by JavaScript-based analytics.

8. You can find out about hacker attacks

Hackers can attack your website with various methods, but none of them would be recorded by JavaScript analytics. Since every access to your web server is recorded in the log files, you can identify attacks and save yourself from damage (by blacklisting the offending domains or closing security holes on your website).
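To illustrate, spotting such probing can be as simple as matching the request field of each log line against a few telltale patterns. A minimal sketch (the sample lines and the pattern list are invented; a real blacklist would be much longer):

```python
# Invented substrings that often indicate probing for known vulnerabilities.
SUSPICIOUS = ("/etc/passwd", "phpmyadmin", "<script", "../")

# Invented sample lines in Apache combined log format.
LOG_LINES = [
    '198.51.100.3 - - [11/Oct/2009:02:12:09 +0000] "GET /index.php?page=../../etc/passwd HTTP/1.1" 404 512 "-" "-"',
    '203.0.113.8 - - [11/Oct/2009:02:13:00 +0000] "GET /phpmyadmin/ HTTP/1.1" 404 512 "-" "-"',
    '192.0.2.15 - - [11/Oct/2009:09:00:00 +0000] "GET /index.html HTTP/1.1" 200 10240 "-" "Mozilla/5.0"',
]

def suspicious_hosts(lines):
    """Return the client hosts whose requests contain a suspicious pattern."""
    hosts = set()
    for line in lines:
        request = line.split('"')[1].lower()  # first quoted field is the request
        if any(pattern in request for pattern in SUSPICIOUS):
            hosts.add(line.split()[0])        # first field is the client host
    return hosts

print(sorted(suspicious_hosts(LOG_LINES)))
```

The resulting host list is exactly what you would feed into a firewall or server blacklist.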

9. Log files contain error information

Without log files, you generally have no information about errors and status codes (such as Page Not Found, Internal Server Error, Forbidden, etc.). You would be missing possible technical problems with your website that lower visitors’ overall perception of its quality. Moreover, any attempt to access forbidden areas of your website can be easily identified.
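For example, a basic error overview is just a tally of the non-2xx status codes found in the log. A minimal sketch (the sample lines below are invented for illustration):

```python
import re
from collections import Counter

# Invented sample lines in Apache combined log format.
LOG_LINES = [
    '192.0.2.1 - - [12/Oct/2009:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 10240 "-" "Mozilla/5.0"',
    '192.0.2.2 - - [12/Oct/2009:10:00:05 +0000] "GET /missing.html HTTP/1.1" 404 512 "-" "Mozilla/5.0"',
    '192.0.2.3 - - [12/Oct/2009:10:00:09 +0000] "GET /admin/ HTTP/1.1" 403 512 "-" "Mozilla/5.0"',
    '192.0.2.4 - - [12/Oct/2009:10:00:12 +0000] "GET /buggy.php HTTP/1.1" 500 256 "-" "Mozilla/5.0"',
]

STATUS_PATTERN = re.compile(r'" (\d{3}) ')  # status code follows the quoted request

def error_counts(lines):
    """Count occurrences of each non-2xx status code."""
    counts = Counter()
    for line in lines:
        m = STATUS_PATTERN.search(line)
        if m and not m.group(1).startswith("2"):
            counts[m.group(1)] += 1
    return counts

print(error_counts(LOG_LINES))
```

Mapping the codes to names (404 Not Found, 403 Forbidden, 500 Internal Server Error) then gives the kind of error report a JavaScript tracker never sees.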

10. By using log file analyzer, you don’t give away your business data

And last but not least, your stats are not available to a third party who can use them at their convenience. Google bought all rights to Urchin (at the time a popular and quite expensive web statistics product), repackaged it, and then allowed anyone to use it for free. The question is: why? They surely get something in return, as the Google Analytics license agreement allows them to use your information for their own purposes, and even to share it with others if you choose to participate in the sharing program.

What could they possibly use it for? Just to give a few obvious ideas: tweaking AdWords minimum bids, deciding how to prioritize ads, improving their services (and profits) – all based on traffic data collected from you and others.

Related links

Busting the Google Analytics Mythbuster
Which web log analyzer should I use?
What price Google Analytics? (by Dave Collins)
Web Log Storming – an interactive web log analyzer

Accounting software for startups – anyone interested?

There’s really a lot of accounting/finance software available, free or commercial, and creating another one seems quite unnecessary. Still, most of it targets either home users or large companies, even though some products claim to be intended for small businesses. Alright, if those are for small businesses, what about micro-businesses? Being a small (micro?) software company (moreover, one located outside the USA, UK, or Canada), we really don’t need most of the options these applications offer, while they don’t fulfill other needs that we have.

After, literally, years of searching, we finally decided to build a tool for ourselves – one actually usable for micro-ISVs and other service-based companies. We believe there are more company owners out there who feel the same and who currently track their finances in spreadsheets or try to adapt the available tools. That’s why we are seriously thinking about polishing this software and publishing it as a product.

What do you think? If you are interested in this kind of application, please go to the Fresh Flow Accounting website, where you can read a short introduction, subscribe to a mailing list, or send us a comment to support us in building this (or to tell us how stupid we are). 🙂

Fresh Flow Accounting website

Busting the Google Analytics Mythbuster

In a recent article on the Google Analytics Blog, the author tries to bust several myths circulating in public. There are a few half-truths and (intentional?) deceptions there that I simply can’t ignore.

As I mentioned earlier, Google Analytics can be a nice addition to your main analytics solution (even we use it occasionally), provided you don’t mind the baggage that comes with the “free” label (What price Google Analytics?, Google Analytics – is it worth its price?, Google Analytics is not free, or search Google 🙂 for more). JavaScript-based systems give some information that log files can’t, but they also suffer from several drawbacks that limit their value if used alone.

As each product has its own audience, I won’t question anyone’s decision to choose one type of solution or another, but some things simply must be said, regardless of how many people will read this compared to the original article. 🙂

MYTH 1: "You get what you pay for. Google Analytics is free, which means the system is down a lot."

I do agree that the GA system is not down very often (if ever). Why would it be? They have more than enough resources to keep it alive, and imagine how much data they would lose in just one minute of downtime. But no matter how powerful their servers are, your website will inevitably be slower. I doubt you’ll find this particularly alarming, but still…

MYTH 4: Google Analytics is not really accurate

Google Analytics uses JavaScript tags to collect data. This industry-standard method […] discrepancies greater than 10%, it’s due to an installation issue. Common problems include JavaScript errors, redirects, untagged pages and slow client-side load times.

[…]

All web analytics tools face the same technical limitations posed by JavaScript tags […]

Ouch. This one is the main motivation for me to write this article. I’ll just comment on the phrases in bold (in order of appearance).

  1. JavaScript tags are just one of the methods used today. Even if we ignore custom in-house systems (based on whatever web developers use: PHP, ASP, Python, Ruby on Rails, …), pretending that the still widely accepted server log file analysis doesn’t exist is, at the very least, an intentional delusion.
  2. Expected discrepancies of 10% or below among JavaScript-based analyzers may well be true, but compared to log file analysis, they show 2 to 5 times less traffic.
  3. It could be an installation issue only where visitors can be tracked with JavaScript at all. What about the rest of the traffic?
  4. Again, we can talk about errors and slow connections only when JavaScript tracking is possible in the first place.
  5. Saying that all web analytics tools face the same limitations is simply not true. JavaScript-based web analytics tools do have these limitations; log file analyzers don’t.

Pardon me if you don’t care about visitors who block JavaScript or click a different link before the tracking script loads, websites that directly link to images on your site, downloads of non-HTML files (PDF, ZIP, EXE, images, …), bandwidth usage, spiders, bots that pull down your whole website on a regular basis (wasting your bandwidth), direct access to scripts by hackers, etc., etc. Sure, with JavaScript analytics you can see trends, and if you only care about marketing that could be good enough, but the total numbers are not even close, and you can forget about the other information that can be found in server log files.

MYTH 6: With Google Analytics you can’t control your data

Yes, you can control your data… to some degree. Google promises to resist the urge to analyze your data for its own purposes (if you don’t forget to explicitly say so), but the fact is that they already have your data, right there. In this information era, knowledge is a big asset. Sorry, but I don’t buy that they won’t ever “peek”, just a little. Probably under the excuse of “serving better search results” (or, more likely, “serving better advertisements”). And I’m not talking about analytics only: they have search queries, e-mails, documents, appointments, instant messages, etc. They predicted the Eurovision 2009 contest winner based on what people search for, and I should believe they won’t silently use all the information they can for profit? Right…

Even if you do trust Google (and every one of its employees), you still can’t say that you fully control your data, as it remains on their servers. Anything can happen in the future. What if Google goes bankrupt? Okay, not likely, but possible. 🙂 Therefore, you can’t fully control your data. But don’t get me wrong: I admit there are a few pros. For example, you don’t need to think about backups – the data is much safer on Google’s servers than on your computer. 🙂

* * *

Disclaimer: the purpose of this article is not to persuade anyone to use a server log analyzer instead of Google Analytics (I wrote another article for that 🙂 ), but to point out a few things that are too easily overlooked these days, intentionally or not.

Does world really need so many PIM applications?

Is Sisyphus building another PIM?

“Opinions are like noses, everyone has one”

When I mention that my company sells PIM software, I can almost hear “Oh no! Another one?!”. Other PIM developers surely share a similar experience, so let me try to explain what the heck I was (we were?) thinking.

Back in 2002, when our first products had just started to pick up and I was getting serious about all this, I needed a PIM / to-do list application that I could actually use. After trying out dozens of existing products, I finally came to the conclusion that none of them suited my needs. Some were too complicated for what I needed, some were too simple, and some were just too ugly for my taste. 🙂 Of course, I was aware that these products were perfect for other people, but still, I needed an application that did things differently.

Full of enthusiasm, without hesitation, business plans, or similar mumbo-jumbo, I sat down and started a new project: Agenda At Once. Back then I didn’t know what we would do with it: give it away, sell it, or just use it internally. I was simply satisfied with the feeling of creating something new and innovative.

After seeing what didn’t work for me, I had a decent picture of what would. I took these as starting points:

1. My job doesn’t include many fixed appointments – I should probably pay the most attention to to-do list management.

2. Of course, from time to time I do need to schedule or attend a meeting, so the application should have this capability too.

3. It should be possible to divide tasks into subtasks. Strangely enough, in 2002 not many (if any) PIMs supported this.

4. It should be possible to enter free-form notes somehow, for any data that doesn’t strictly fall into the “task” category.

5. It should be really easy to use – drag & drop, plenty of keyboard shortcuts, a descriptive and simple interface.

It turns out that what worked for me worked for a lot of other people too. Soon enough, users started suggesting new features, and most of them have been implemented over the years. Although the number of features and options has multiplied compared to the first release, I think we have succeeded in maintaining almost the same level of simplicity and keeping the original philosophy.

So, what’s the answer? My guess is: yes, as long as at least one developer is motivated enough to create it – there’s a good chance that many non-developers have similar noses, er, opinions.