This is the typical result of "enterprise" applications that are more interested in decoupling the logging system from the framework than in picking one that works.
Over-abstraction at work.
There is a "commons logging" for .Net as well which replicates the entire log4net API 1:1. That is where the crazy starts to set in.
That's a reasonable (if knee-jerk) first reaction. But reality is a bit more complicated.
log4j was the first de facto standard (excluding System.out). Making it part of the standard API would have been a no-brainer. Unfortunately, java.util.logging sucks (hard to customize, coupling with J2EE app servers). So people keep using log4j.
Library writers now have a problem: How do you choose which logging framework to use? You really want to use the same framework as your users, the developers making apps. Thus was born commons-logging.
commons-logging has a number of problems, including non-deterministic setup and being bundled with some common app servers (which creates its own share of problems).
Thus: slf4j. Which works.
I agree with the author that slf4j is the way to go. Alternatively you can just use log4j directly. But slf4j has one feature which log4j doesn't: format strings. If you've ever had to add "if (log.isDebugEnabled())" checks to logging code you'll appreciate what a difference this makes.
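To make the difference concrete, here is a minimal self-contained sketch of the slf4j-style `{}` placeholder mechanism. The `Logger` class below is a hypothetical stand-in (not the real slf4j API): the point is that the arguments are only rendered into the message when the level is actually enabled, so no explicit guard is needed at the call site.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class FormatStringDemo {
    // Counts how often an expensive toString() actually runs.
    static final AtomicInteger renders = new AtomicInteger();

    static class Expensive {
        @Override public String toString() {
            renders.incrementAndGet();
            return "expensive";
        }
    }

    // Hypothetical stand-in for an slf4j-style logger.
    static class Logger {
        final boolean debugEnabled;
        Logger(boolean debugEnabled) { this.debugEnabled = debugEnabled; }

        // The message and its arguments are only combined when the level
        // is enabled; callers pass cheap references, not a built string.
        void debug(String format, Object... args) {
            if (!debugEnabled) return;           // cheap early exit
            StringBuilder sb = new StringBuilder();
            int argIdx = 0, from = 0, at;
            while ((at = format.indexOf("{}", from)) >= 0) {
                sb.append(format, from, at).append(args[argIdx++]);
                from = at + 2;
            }
            System.out.println(sb.append(format.substring(from)));
        }
    }

    public static void main(String[] args) {
        Expensive e = new Expensive();
        new Logger(false).debug("value = {}", e); // disabled: toString() never runs
        new Logger(true).debug("value = {}", e);  // enabled: rendered exactly once
        System.out.println("renders = " + renders.get());
    }
}
```

With the guard-style API, `"value = " + e` would be concatenated (and `toString()` paid for) before the level check even ran, which is exactly what the `isDebugEnabled()` idiom exists to avoid.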
In conclusion: Java logging is a clusterfuck, but if you know what to use you'll be fine. The end.
I believe one of the reasons is that logging is not a simple problem, yet it is ubiquitous across all applications.
It is not simple because:
1. your logging is usually a crosscutting concern, and is really a fit for aspect-oriented programming mechanisms.[1] In languages with mixins, like Ruby and Scala, this is not a big issue, as you don't need full AOP to solve it. But AspectJ is not really widespread, adds quite a lot of complexity to a small project, and is not standard Java, which makes a bad situation worse.
2. your logging must be easy to disable, and should only compute what is actually needed. In languages with closures and AST metaprogramming, like Lisp macros, you can do this very easily. Scala is particularly suitable, as you can use call-by-name[2] semantics to make the closures implicit.
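The call-by-name idea in point 2 can be approximated in Java with `Supplier` lambdas (java.util.logging and log4j2 both offer overloads along these lines). Below is a minimal sketch with a hypothetical `Logger`: the closure plays the role of a Scala by-name parameter (`msg: => String`), so the expensive message expression is only evaluated if the level is enabled.

```java
import java.util.function.Supplier;
import java.util.concurrent.atomic.AtomicInteger;

public class LazyLoggingDemo {
    // Tracks how often the expensive message is actually computed.
    static final AtomicInteger computed = new AtomicInteger();

    // Hypothetical logger: the Supplier wraps the message expression
    // in a closure, deferring evaluation until the level check passes.
    static class Logger {
        final boolean debugEnabled;
        Logger(boolean debugEnabled) { this.debugEnabled = debugEnabled; }

        void debug(Supplier<String> msg) {
            if (debugEnabled) {
                System.out.println(msg.get()); // evaluated only here
            }
        }
    }

    static String expensiveDump() {
        computed.incrementAndGet();   // count how often we actually pay
        return "huge state dump";
    }

    public static void main(String[] args) {
        new Logger(false).debug(() -> "state: " + expensiveDump()); // closure never invoked
        new Logger(true).debug(() -> "state: " + expensiveDump());  // invoked exactly once
        System.out.println("computed = " + computed.get());         // -> computed = 1
    }
}
```

In Scala the lambda syntax disappears entirely: with `def debug(msg: => String)` the caller writes `log.debug("state: " + expensiveDump())` and still gets the deferred evaluation for free, which is the point the parent is making.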
The definition of murder was "manipulated" around the time of the Second World War to include the word "unlawful".
Find an old dictionary (circa 1930) and look up the definition of murder. It mentions only a "premeditated killing".
This is an unpopular opinion, especially with the "perpetual war" that we have to endure, but if you kill another person regardless of the justification, even if it's your job, or they wronged you, then you are a murderer. That includes the people who perform the executions and all soldiers. Putting a label on it or changing semantics doesn't make it ok.
(This was downvoted immediately obviously by a supporter of murderers)
That is interesting, but "murder" has a clear definition in 2011, and I don't think that using the 1930 definition in 2011 clarifies communication.
Of course, there are valid moral points to be made here, but if you want to make those you should make an ethical argument, not just play semantic games. (Please don't, it's way off topic on HN.)
I won't go further than to say that when semantics are changed by politics, we all lose. Legitimising murder in the name of war and justice is a slippery slope.
"An-eye-for-an-eye-for-an-eye-for-an-eye ... ends in making everybody blind." (Gandhi).
It's sometimes necessary to use violence in order to create disincentives for it, and discourage others from engaging in it.
Gandhi is a good guy and all, but he isn't always right:
I would like you to lay down the arms you have as being useless for saving you or humanity. You will invite Herr Hitler and Signor Mussolini to take what they want of the countries you call your possessions...If these gentlemen choose to occupy your homes, you will vacate them. If they do not give you free passage out, you will allow yourselves, man, woman, and child, to be slaughtered, but you will refuse to owe allegiance to them. - Gandhi, 1940
1930 was not the first time murder and killing were made distinct, by a long shot. One ancient example is Mosaic law, which forbids murder, yet commands capital punishment for certain crimes (including murder). Actually, I'd bet that your point of view is a very modern one. Can you cite an ancient source for it?
There is a lot of unjustified killing in the world, I'll grant you that. But all killing is murder? I can't agree. If you see a suicide bomber about to blow himself up in a mall, and you kill him, saving everyone's lives there, I would not call that murder. I would call that heroism.
This was down-voted because you attempted to express your opinion as a fact. If you look at even older religious texts, you'll find that murder is not condoned while capital punishment and war are considered justified.
I think I lean towards your opinion ... capital punishment leaves me with a lot of questions, but I also think there are cases where it's clear-cut. I guess I'm thankful that I'm not the one dripping the poison into prisoners' veins.
Maybe have the monitor on some sort of swivel mount in the bathroom? I think that'd be awesome... but maybe awkward to type on a touch keyboard that big. :)
Like I suspected, this entire thread has been turned by zealots into a Microsoft-bashing exercise.
I genuinely despair for people who spend their entire time platform-bashing, add nothing constructive to the discussion, and tar and feather one side religiously. It paints a very bad picture of the "startup culture" amongst more established organisations.
I had one of those FW900's. Unfortunately the other half got fed up with me lugging immense bits of kit around so I reluctantly gave up. Now I've got a standard 23" 1080p TFT :(