Sunday, October 7, 2012

Open-access deal for particle physics : Nature News & Comment

This is good news, but it still doesn’t make me want to be a particle physicist.

After six years of negotiation, the Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is now close to ensuring that nearly all particle-physics articles — about 7,000 publications last year — are made immediately free on journal websites. Upfront payments from libraries will fund the access.

Open-access deal for particle physics : Nature News & Comment

Google Spanner: I told you so.

The pendulum swings back (toward relational-style features).

Some authors have claimed that general two-phase commit is too expensive to support, because of the performance or availability problems that it brings [9, 10, 19]. We believe it is better to have application programmers deal with performance problems due to overuse of transactions as bottlenecks arise, rather than always coding around the lack of transactions.

You can read the technical paper here.
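For readers who haven't run into it, here is a minimal sketch of the two-phase commit protocol the quote refers to, written as coordinator-side Python. The Participant class and its prepare/commit/abort methods are hypothetical stand-ins for illustration, not Spanner's actual interfaces. The cost the critics object to is visible in the structure: every commit needs a full round of prepare messages plus a second round of commit or abort messages, and a coordinator failure between the two phases leaves prepared participants blocked.

# Minimal two-phase commit sketch (coordinator side only).
# Participant and its methods are hypothetical stand-ins for illustration.

class Participant:
    """Hypothetical participant that can tentatively apply a transaction."""

    def __init__(self, name):
        self.name = name

    def prepare(self, txn_id):
        # Phase 1: durably record the transaction and vote.
        # Return True to vote "commit", False to vote "abort".
        print(f"{self.name}: prepared {txn_id}")
        return True

    def commit(self, txn_id):
        print(f"{self.name}: committed {txn_id}")

    def abort(self, txn_id):
        print(f"{self.name}: aborted {txn_id}")


def two_phase_commit(txn_id, participants):
    # Phase 1 (voting): every participant must prepare and vote yes.
    votes = [p.prepare(txn_id) for p in participants]

    # Phase 2 (decision): commit only on a unanimous yes vote;
    # otherwise tell everyone to roll back.
    if all(votes):
        for p in participants:
            p.commit(txn_id)
        return True
    for p in participants:
        p.abort(txn_id)
    return False


if __name__ == "__main__":
    shards = [Participant("shard-1"), Participant("shard-2"), Participant("shard-3")]
    two_phase_commit("txn-42", shards)

Spanner's position, per the quote, is that this cost is worth paying, and that programmers can back off from transactions in the specific places where they become a bottleneck.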

Research fraud exploded over the last decade | Ars Technica

Sigh.  Information quality and information responsibility in the news.

A number of studies have spotted a worrisome trend: although the number of scientific journals and articles published is increasing each year, the rate of papers being retracted as invalid is increasing even faster. Some of these are being retracted due to obvious ethical lapses—fraudulent data or plagiarism—but some past studies have suggested errors and technical problems were the cause of the majority of problems.

A new analysis, released by PNAS, shows this rosy picture probably isn't true. Researchers like to portray their retractions as being the result of errors, but a lot of these same papers turn out to be fraudulent when fully investigated.

Research fraud exploded over the last decade | Ars Technica

Twitter, PayPal reveal database performance - Software - Technology - News - iTnews.com.au

This is why the definition of Big Data as “data too big for traditional technologies like RDBMS” is a Big Load.

Jeremy Cole, database administration team manager at Twitter, told attendees that the micro-blogging network uses a commercial instance of MySQL because there are "some features we desperately need to manage the scale we have and to respond to problems in production".

Cole revealed that Twitter's MySQL database handles some huge numbers — three million new rows per day, the storage of 400 million tweets per day replicated four times over — but it is managed by a team of only six full-time administrators and a sole MySQL developer.
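To put those figures in perspective, here is a quick back-of-envelope calculation in Python, assuming the numbers in the quote and a (certainly unrealistic) uniform write rate:

# Back-of-envelope arithmetic from the figures quoted above: 400 million
# tweets per day, each stored with four-way replication. Illustrative only.

TWEETS_PER_DAY = 400_000_000
REPLICATION_FACTOR = 4
SECONDS_PER_DAY = 24 * 60 * 60

writes_per_day = TWEETS_PER_DAY * REPLICATION_FACTOR
avg_writes_per_second = writes_per_day / SECONDS_PER_DAY

print(f"{writes_per_day:,} replicated writes per day")
print(f"~{avg_writes_per_second:,.0f} writes per second on average")
# ~18,519 writes per second, before peak load or any non-tweet tables.

That works out to roughly 18,500 sustained replicated writes per second, handled on MySQL by six administrators and a single developer.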

Twitter, PayPal reveal database performance - Software - Technology - News - iTnews.com.au

E-Health Insider :: NHS staff should code - Kelsey

Uh oh.  This reminds me of the “democratization of data” phenomenon of the late ’80s, when “power users” of tools like FoxPro, dBase, and Lotus 1-2-3 went to town building departmental applications.  The effects on enterprise data quality were disastrous then, and there’s no reason for optimism now.

Tim Kelsey, the NHS Commissioning Board’s first national director of patients and information, is to encourage doctors, nurses and other front-line staff to learn how to program.

The new NHS information chief, and former Cabinet Office transparency tsar, says encouraging NHS staff to code will give them the skills to work with data and help unleash a powerful and disruptive wave of innovation.

E-Health Insider :: NHS staff should code - Kelsey