I break with my no-underlying-theme theme and do an all-technology tab sweep; in fact, almost all XML.

Hey, check it out: egaugnaL pukraM elbisnetxE! I think logfiles ought to be line-oriented so you can use grep on ’em. This is not incompatible with XML unless you’re dogmatic and pedantic.
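Line-oriented and XML really can coexist — here's a minimal sketch in Python, assuming a hypothetical log format where each entry is one complete XML element per line, so plain grep-style line filtering still works:

```python
# Hypothetical line-oriented XML log: one complete <entry/> element per
# line, so ordinary line tools (grep, awk) still work on it.
log = """\
<entry level="INFO" time="2007-03-14T09:00:00Z">startup complete</entry>
<entry level="ERROR" time="2007-03-14T09:05:12Z">disk full on /var</entry>
<entry level="INFO" time="2007-03-14T09:06:00Z">retrying write</entry>
"""

# The grep equivalent: keep only the lines matching a pattern.
errors = [line for line in log.splitlines() if 'level="ERROR"' in line]
for line in errors:
    print(line)
```

Each line is also well-formed XML on its own, so an XML-aware tool can pick up where the line tools leave off.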

Real-world WS-*, as in WSIT; Milestone 3 is out. Follow some of Eduardo’s links to find out about people who are making Java play nice with WCF.

Speaking of Web Services, Marc Hadley’s WADL is showing signs of traction; see Automatic Multi Language Program Library Generation for REST APIs.

Moving away from XML, check out the Joyent Accelerator re-pricing. This is higher-end than you’d need for a casual blog, but it’s exactly the kind of thing I’d use if Matt didn’t mind being my ISP as a sideline.

Finally, Graeme Rocher tells us in The Charles Nutter Ruby on Grails story that it’s Grails for teh win and everything else suxx0rz; also, in JRuby, Groovy & Java Integration, that only languages designed for the Java platform can “truly integrate with Java”. Interesting stuff; apparently Grails’ strategy for victory does not include making friends along the way.



Contributions



From: Jamie (Mar 14 2007, at 15:39)

>I think logfiles ought to be line-oriented so you can use grep on ’em.

I think logfiles should be line-oriented with consistent & unambiguous separators so you can use awk on them. I'm sure you could futz around to get something similar working with XML logfiles, but even bears of little brain like myself can drive chains of awk, grep, sort & uniq.
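The kind of chain Jamie means can be sketched in Python too — the equivalent of `awk '{print $1}' log | sort | uniq -c` over a hypothetical space-separated access log:

```python
from collections import Counter

# Hypothetical access log with consistent, unambiguous separators
# (single spaces) -- the property that makes awk-style field
# splitting reliable in the first place.
log_lines = [
    "10.0.0.1 GET /index.html 200",
    "10.0.0.2 GET /feed.xml 200",
    "10.0.0.1 POST /comment 403",
]

# awk '{print $1}' | sort | uniq -c, in one pass:
counts = Counter(line.split()[0] for line in log_lines)
for ip, n in sorted(counts.items()):
    print(n, ip)
```

The whole pipeline rests on that separator discipline; one field with an embedded space and the column numbering silently goes wrong.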


From: Alastair (Mar 14 2007, at 17:49)

"I think logfiles ought to be line-oriented so you can use grep on ’em."

Plus the fact that any log file written in XML is bloated, which can greatly increase the size of the log files. This will in turn increase the time it takes to write the log file: extra IO cycles used up writing tags. For a server environment with hundreds of thousands of interactions I'm not interested in having this happen...


From: robert (Mar 14 2007, at 19:31)

>> but even bears of little brain like myself can drive chains of awk, grep, sort & uniq.

alas, fewer folks each and every day, in each and every way, understand that unix (and descendants) exists on the proposition that pipe is Your Friend.


From: Josh Peters (Mar 14 2007, at 21:18)

@robert: unix pipes are all well and good, but XML allows for semantic pipelines which can be even more useful.

Tags add value to data. In my opinion, the more metadata a logfile has in it the better in the long run, as it offers me more options for future parsing. I dearly wish my workplace's web server (Microsoft IIS) used an XML dialect for its logfile, since it would make parsing that much easier. The aforementioned unix tools are great, but they all force data to be of a certain format (plain text). Administrators have more important things to worry about than whether they took into account the potential extra space in a logfile which could screw up a regular expression.

Why hasn't anyone made awk and sed's XML equivalents? XSL and XPath are close but no cigar.

In the end XML-based tools are always going to be more capable as XSLT could always transform the contents into a more awk-friendly form. The intermediate representation has less importance than the original representation, which should almost always be a dialect of XML.
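For what it's worth, something close to an XML grep is already in easy reach — a sketch using Python's `xml.etree` and its limited XPath support, over a hypothetical XML logfile:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML logfile: metadata lives in attributes rather than
# in field positions, so queries don't depend on whitespace conventions.
doc = ET.fromstring("""
<log>
  <entry level="INFO"><msg>startup</msg></entry>
  <entry level="ERROR"><msg>disk full</msg></entry>
  <entry level="ERROR"><msg>timeout</msg></entry>
</log>
""")

# XPath-style predicate: the "grep" for structured data.
errors = doc.findall('.//entry[@level="ERROR"]')
for e in errors:
    print(e.find('msg').text)
```

The predicate selects on structure, not on byte offsets, which is the point Josh is making about semantic pipelines.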


From: Graeme Rocher (Mar 15 2007, at 00:06)

Tim, that is a seriously mature comment coming from someone of your stature. I'm pretty disappointed, actually.

If you actually read the posts properly you would see that in no way was I saying Ruby on anything else "sux0rs" (are you twelve?); I was merely trying to point out Groovy & Grails' strengths, which clearly had not been well expressed in Charles' original post.

Cheers

Graeme


From: Al Lang (Mar 15 2007, at 01:22)

> I think logfiles ought to be line-oriented so you can use grep on ’em.

If you (I'm looking at you, commenters above) actually follow the link and find out about the "log files" that LMX was written to parse, you'll find they are Adium's record of the conversations its user has had. (Adium is a multiprotocol instant messaging client).

That's a *completely* different requirement from "a server environment with hundreds of thousands of interactions".

One more click even gets you to a pretty clear statement of the criteria they used when choosing the format (http://trac.adiumx.com/wiki/LogFormatIdeas). There's a lot more going on there than greppability.


From: Alastair (Mar 15 2007, at 08:52)

> you'll find they are Adium's record of the conversations its user has had. (Adium is a multiprotocol instant messaging client).
>
> That's a *completely* different requirement from "a server environment with hundreds of thousands of interactions".

So would you prefer a more targeted statement such as "XML log files are useful for lightweight logging situations"? XML is not the answer for everything. Sometimes it can be useful for log files (I have used it myself), but it can also be a hindrance for performance.

I agree about XSLT as a grep alternative. We have some applications that output memory-leak stacks in an XML format. A simple XSLT was written that enabled developers to query for call stacks based on source files that they had written or were owners of. Extremely useful. Using XML as the log format for the full call-trace logging of our application, though, would be extremely inefficient and a waste of our limited CPU and IO cycles.
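The query Alastair describes was done with XSLT; the same idea can be sketched in Python against a hypothetical leak-report format (the element and attribute names here are invented), selecting only the stacks that touch a given source file:

```python
import xml.etree.ElementTree as ET

# Hypothetical memory-leak report: each <stack> lists its <frame>s with
# a "file" attribute, so a developer can pull out just the stacks
# touching code they own.
report = ET.fromstring("""
<leaks>
  <stack bytes="1024"><frame file="cache.c"/><frame file="main.c"/></stack>
  <stack bytes="64"><frame file="net.c"/><frame file="main.c"/></stack>
</leaks>
""")

mine = [s for s in report.findall('stack')
        if any(f.get('file') == 'cache.c' for f in s.findall('frame'))]
for s in mine:
    print(s.get('bytes'), 'bytes leaked via cache.c')
```

This is the "ownership filter" use case: cheap once the report is structured, painful against free-form text dumps.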


From: Wes Felter (Mar 15 2007, at 11:58)

I wouldn't say that Joyent repriced so much as added new plans with less service for less money. It's still good, though.


March 14, 2007