Category Archive: Linux

Zimbra has been a big hit.  I'm currently trying to get the software lifecycles synchronized.  Zimbra 7 has just been released.. unfortunately Zimbra no longer supports Debian.  CentOS 6 is about to be released, and I'm not a fan of Ubuntu.  CentOS 5 expires in 2014, which is about the same time as Zimbra 7.  It looks like I'll stick with CentOS 5 until the EOL of Zimbra 7, then for Zimbra 8 change both the Zimbra version and the host operating system.

Servers:  One of the donated rackmounts is now running Astaro again.  Untangle let me down when it counted, and I find the conduct of their founder and COO distasteful.  I had a bad e-mail get past the Untangle system and infect one of my users' computers.  I've since switched to Astaro, and frankly I couldn't be happier.  Not only has spam detection gone up to 99% or higher, but false positives are nearly zero.  So far Astaro is rejecting 90% of all spam before it even gets to the anti-spam and A/V engines.  This has led to a marked decrease in resource usage on the Zimbra server.  I honestly had no idea how much was getting by Untangle until I installed Astaro.

I also had all the UPS units in the server room fail.  Luckily I was able to get a new single, large UPS that's ultimately capable of running everything in the server room for at least 10 minutes.  Once I get the control software installed, the main server will be able to send graceful shutdown signals to the mail server and firewall server if there is a sustained power disruption.  The file server will also shut down gracefully, meaning less chance of file system crashes or corruption..:)
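As a sketch of how that wiring could look, assuming the control software ends up being something like Network UPS Tools (NUT): the machine attached to the UPS watches the battery, and each other server runs upsmon pointed at it.  The UPS name, hostname, and credentials below are hypothetical.

```
# /etc/nut/upsmon.conf on the mail or firewall server (names are made up)
# MONITOR <ups>@<host> <powervalue> <user> <password> <master|slave>
MONITOR rack-ups@mainserver 1 upsmon secretpass slave

# Command run when the master says the battery is about to die:
SHUTDOWNCMD "/sbin/shutdown -h +0"
```

When the main server sees the battery go critical it broadcasts the event, and every slave runs its SHUTDOWNCMD for a clean halt.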

There are a couple of large projects coming, but I'm not going to talk about them until everything is in place..:)

Got the new server online months ago.. sorry for the lack of news.  I wound up sticking with Debian.  Everything went smoothly, and there are now several domains running off of this box, including multiple streaming servers.  Now a bigger challenge looms: moving this station to a new location AND hooking everything up at its new location.  Will keep folks posted as I can.

I just did my annual update to the Linux Counter Project.  Once I finished my updates I was quite shocked at what I found.  Out of all the machines I manage in one form or another (at least server-wise), the vast majority are Linux boxes.  Some of my clients have two Linux servers.  Desktops are overwhelmingly Windows, however.  Out of 13 servers, 10 of them run Linux.  That's quite amazing when you think of it.  I did not have any agenda when doing this.. I simply chose what I felt was the best tool for the job.  Of those 10 Linux boxes there are 4 dedicated firewalls, 1 web hosting server, three file servers, and one dedicated mail server.  The distributions represented are Astaro (1), Untangle (3), Debian (2), CentOS (1, running the Zimbra Groupware Suite), SME (1), and Zentyal (2), of former eBox fame.  That's an amazing variety that I was quite surprised to see.  Going about my daily business it's easy to not really realize your layouts until you do an independent audit like this and have it stare back at you..:)

This article says it better than I can.  The GPL is actually now causing more issues than it solves.  I would actually use the GPL to RESTRICT who can use my software and how.. whereas the BSD license is a truly free open source license.

Internal Email on Why a Software Company Migrates Away from MySQL.

InnoDB Storage Engine Dropped From Oracle MySQL Classic Edition.

Oracle owns InnoDB and now MySQL.  It's time to move from MySQL to Postgres.  Not one of the MySQL forks.. but Postgres.  Oracle is beginning the squeeze of Sun's properties.
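For what it's worth, a move like that doesn't have to be painful.  One route (a sketch only; the database names and credentials here are made up) is the pgloader tool, which reads straight from a live MySQL server, creates matching tables on the Postgres side, and streams the rows across:

```shell
# One-shot migration sketch with pgloader (hypothetical names/credentials).
# pgloader infers the MySQL schema, creates it in Postgres, and copies the data.
pgloader mysql://appuser:secret@dbhost/appdb \
         postgresql://appuser@localhost/appdb
```

You would still want to audit types and indexes afterward, but it beats hand-porting a dump.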

I have been doing IT work as a volunteer for a local radio station, WTHU, for a few years now.  Slowly but surely we have been moving along the technology track in the right direction.  We have a stout server in Washington state that handles our streaming.  The costs for this are extremely reasonable, but times have gotten tight and we have to find ways to cut costs even more.  That led to a local vendor, Swift Systems, kindly donating a 2U rackmount server, colocation, power, and an unmetered 10 megabit port.  I went into Swift Systems today to install Debian onto said server.  This turned out to be quite the adventure.  Some of it was totally me.. I was not familiar with Debian 5.  I've used several variants, including the near-ubiquitous Ubuntu (which I would NEVER put onto a server), but I wanted the real Debian.  The hardware issue is the CD-ROM drive.  I don't know why.. but it's sssssssllllloooowwwwwww.  Painfully slow.  The rest of the box, however, is very, very fast.  Debian in its default install mode will only allow you to configure one interface at install time.  If you give it an address that does not have internet connectivity, then when it tries to build its mirror list it will time out (after about 5-10 minutes) and use ONLY the CD-ROM.  I found this out the hard way.  I was not going to do that.  I tried a reinstall but again was met with the slooow CD-ROM..:)  I tried to set up one interface via DHCP (so it would get a local IP) and the other interface as static, to no avail in the installer.  I set up the iLO with another static IP in the assigned range and will have them rack the box.  I should be able to get into the machine using the iLO, then use the console redirect to install Debian onto the static range.  I should then be able to build the repos properly and have a working Debian install.
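The one-interface limit only bites during the install itself; once the base system is on disk the second interface is a few lines in /etc/network/interfaces.  A sketch with one DHCP leg and one static leg (interface names and addresses are assumptions, not the real assigned range):

```
# /etc/network/interfaces -- hypothetical addresses
auto eth0
iface eth0 inet dhcp        # local leg, grabs a lease from the LAN

auto eth1
iface eth1 inet static      # public leg in the colo-assigned range
    address 198.51.100.20
    netmask 255.255.255.0
    gateway 198.51.100.1
```

After saving, `ifup eth1` brings the second interface online without a reboot.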

Why not CentOS?  CentOS 5 is less than 3 years from expiring, and I did not want to have to do an OS upgrade anytime soon.  With CentOS you have to reinstall for an upgrade.  With Debian you just run apt-get and install the new version.  We will see if I can get Debian to install via the iLO.  If not, I'll go with CentOS and deal with the OS upgrade later..:)
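For the record, that Debian in-place upgrade amounts to pointing apt at the new release and letting it do the rest.  A sketch, run as root (release codenames assumed for Debian 5 to 6):

```
# Debian in-place release upgrade sketch (lenny -> squeeze assumed)
sed -i 's/lenny/squeeze/g' /etc/apt/sources.list   # retarget the mirrors
apt-get update                                     # pull the new package lists
apt-get dist-upgrade                               # upgrade the whole release in place
```

One reboot at the end for the new kernel and you're on the next release, no reinstall.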

Well, I got the servers in and really didn't want to wait for the 12U rack.. mainly because it's not in the budget right now.  I took one of the servers and installed Untangle on it.  I now have 4 network cards in the thing.  One is red (internet), one is blue (free public wifi), and one is green (the church's internal network).  The 4th one is for future use (and I already have a plan for it).  What are the specs of this box?  It is an IBM x335 with dual Xeon 2.8 GHz CPUs with HT, 4 gigs of RAM, and two 36-gig 10k U320 SCSI HDDs in hardware RAID 1.  The thing just smokes..:)  I'm waiting for a couple of major events to really test the box:

1.  The Don Piper conference we are having

2.  Upward basketball.

Upward is going to be the bigger test, as we'll have hundreds of folks inside the new wing from 9am to 6pm Saturday and Sunday every week for about 3 months.  I'm hoping to get at least 20 folks on the public wifi so I can see how this box handles it.

I had a Dell PowerEdge 1800 running Astaro as the firewall until this donation came in.  Our e-mail is run by a company called PowWeb, and I have been hearing about unreliable service, crashing interfaces, and other issues for months now.  Since the Dell is 64-bit compatible, I decided to press it into use as the new church e-mail server.  The test for the firewall is: can it handle everything I'm going to throw at it?  E-mail, content filtering, anti-virus scanning, packet inspection, remote access.. etc etc etc.  My research tells me it will.  The most fascinating thing about Untangle is its heavy use of Java.  Java is at the core of the entire system, and ALL traffic passes through this Java core.  So far it's worked without a hitch.  I've set up some simple traffic priority rules that give the church's traffic the highest priority and the free wifi the lowest.  I'll be watching the server closely to see how it does.. not that I'm anticipating problems.. but this is a new product that has impressed me, and I want to see it work under load as I look at the innards to see how it works..:)  Cost for all of this?  $105, and that was just to cover shipping..:)  All of the software is free.
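Untangle handles those priority rules from its GUI, but the underlying idea is the same as a plain-Linux tc priority queue.  A rough sketch of the equivalent (interface and subnets are assumptions for illustration):

```
# Same idea in raw Linux tc; eth0 = red/WAN uplink, subnets are made up.
tc qdisc add dev eth0 root handle 1: prio bands 3

# Church LAN (green) -> highest-priority band
tc filter add dev eth0 parent 1: protocol ip prio 1 u32 \
    match ip src 192.168.10.0/24 flowid 1:1

# Free public wifi (blue) -> lowest-priority band
tc filter add dev eth0 parent 1: protocol ip prio 2 u32 \
    match ip src 192.168.20.0/24 flowid 1:3
```

Band 1:1 gets drained first, so church traffic never waits behind wifi traffic when the uplink is saturated.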

I just need to get the final list of current mailboxes and get the DNS switched over.  Staff meeting this Monday to see if they'll give the green light.  I have found several extensions (called zimlets) that really extend the feature set of the Zimbra platform.  I now have built into the platform:

1.  Automatic detection of UPS and FedEx tracking numbers.  The system will automatically highlight tracking numbers and auto-create hyperlinks.  Clicking the link takes you directly to your tracking information.

2.  Daily summary of tasks and appointments.  When the user logs in, the zimlet checks their calendar for that day and sends them appropriate reminders.

3.  Post Office tracking.  Along the same lines as the UPS/FedEx trackers, this one also grabs postal tracking numbers from several other countries as well.

4.  Social network integration.  Twitter, Facebook, and a couple of others can be integrated into your Zimbra interface.

These are in addition to the base feature set available with the free version.  All of these zimlets are free as well.  The best thing: no more Outlook.  FBC users can get to this anywhere they wish via an HTTPS-secured channel..:)

You can read about the donation here.  I have three IBM x335's on the way with dual P4 Xeon 2.8 GHz CPUs, dual 36-gig 10k RPM SCSI drives with hardware RAID 1, 4 gigs of RAM, all the cables needed (including iLO), and rails.  All for the cost of shipping.  Why am I posting about it here?  I run the network at my church.  This will be the first time I can start something like this from the ground up and document what I do, how I do it, and what hardware and software I do it with.  I will also be able to show just how much free software can do while still integrating with an established Active Directory layout.  It's a chance for other potential NPO clients to see what some creative thinking can accomplish for little or no cost...:)  Stay tuned.. I've created a whole new category for this..:)

This is the primary reason Unix folks remove the computer, make an image for forensics, and then rebuild from a known good source.  Windows folks have yet to figure this one out.  I take the same philosophy toward malware that Unix admins do: nuke the box, because you can't trust that it's clean once it's been compromised.
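The image-for-forensics step is nothing exotic.  A runnable sketch with dd, using a scratch file standing in for the raw disk (real use would read a device like /dev/sda and write to external media):

```shell
#!/bin/sh
# Forensic-image sketch: copy the "disk" bit for bit, then hash both
# copies so the image can be shown faithful before the original is wiped.
DISK=/tmp/suspect_disk       # stand-in for a raw device such as /dev/sda
IMAGE=/tmp/evidence.img

dd if=/dev/urandom of="$DISK" bs=1024 count=64 2>/dev/null    # fake disk contents
dd if="$DISK" of="$IMAGE" bs=64k conv=noerror,sync 2>/dev/null # bit-for-bit copy

sha256sum "$DISK" "$IMAGE"   # matching hashes = faithful image
```

With the image safely hashed and stored, you can nuke and rebuild the real box with a clear conscience.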

In one incident, a sports bar in Miami was targeted by attackers who used a custom-designed rootkit that installed itself in the machine's kernel, making detection particularly difficult. The rootkit had a simple, streamlined design and was found on a server that handled credit card transactions at the bar. It searched for credit card track data, gathered whatever it found and dumped the data to a hidden folder on the machine. The attacker behind the rootkit took the extra step of changing a character in the track data that DLP software looks for in order to identify credit card data as it's leaving a network, making the exfiltration invisible to the security system.

via Persistent, Covert Malware Causing Major Damage | threatpost.

Intel will ship x86 android 2.2 this summer – The Inquirer.

Now this would be interesting.  If this is actually true, then instead of having to get a smartphone with the cell carriers' high-priced scamming built in, I can get a netbook running Android... hrmmmm... I like this idea.  If it works out I might just leave the notebook at home when I go out.

The author makes some great points here.  Take a gander.

Not using desktop Linux? You’re wasting your money | Linux – InfoWorld.

Right now cloud computing isn't a security blessing.. it's a security nightmare.  Most cloud apps actually require you to download and install an executable file that then connects to the cloud.  And the operating system requirement?  Windows.. most of the time.  I would like to see the cloud vendors support a truly web-based model, like Google does.  Then you wouldn't need Windows.. Linux would work.  Your costs go through the floor.  No high costs for server operating system software.. no high costs for desktop operating system software.  There are a couple of gotchas.  One is that most applications don't yet run on a Linux desktop or a true cloud.  The second is disallowing access from outside your company.  This isn't as easy to solve as it seems, since it's a web-based thing.. but considering the low costs, just get your company a static IP (or several) and tell the cloud vendor only those IPs are allowed to access that app.  Then you have the best of both worlds.. in a very brief nutshell.  If you are interested in more details, let me know.  I might fire up the podcast machine..:)
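On the vendor's side, that allow-list is only a couple of firewall rules.  An iptables sketch (the office address below is a documentation IP, not a real one):

```
# Allow only the customer's static office IP to reach the app over HTTPS,
# drop everyone else.  Run as root on the app server's firewall.
iptables -A INPUT -p tcp --dport 443 -s 203.0.113.10 -j ACCEPT
iptables -A INPUT -p tcp --dport 443 -j DROP
```

Add one ACCEPT line per office IP and the app is invisible from anywhere else on the internet.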

Windows Server vs. Linux.

There are some serious errors in this.. I'll address them inline.

Text below:

Windows Server vs. Linux

June 8, 2010 —

Which is better? Microsoft Windows Server or open-source Linux?

This debate arouses vehement opinions, but according to one IT consultant who spends a lot of time with both Windows and Linux, it’s a matter of arguing which server OS is the most appropriate in the context of the job that needs to be done, based on factors such as cost, performance, security and application usage.


“With Linux, the operating system is effectively free,” says Phil Cox, principal consultant with SystemExperts. “With Microsoft, there are licensing fees for any version, so cost is a factor.” And relative to any physical hardware platform, Linux performance appears to be about 25% faster, Cox says.

That's at a minimum.  It's often much higher.  Windows Server Core is an attempt to regain some of that base speed by jettisoning the GUI.

Combine that with the flexibility you have to make kernel modifications, something you can’t do with proprietary Windows, and there’s a lot to say about the benefits of open-source Linux. But that’s not the whole story, Cox points out, noting there are some strong arguments to be made on behalf of Windows, particularly for the enterprise.

For instance, because you can make kernel modifications to Linux, the downside of that is “you need a higher level of expertise to keep a production environment going,” Cox says, noting a lot of people build their own packages and since there are variations of Linux, such as SuSE or Debian, special expertise may be needed.

Windows offers appeal in that “it’s a stable platform, though not as flexible,” Cox says. When it comes to application integration, “Windows is easier,” he says.

Windows most assuredly is NOT easier.  By the time you get to managing patches, default configuration tweaking, and the layers of security you have to pile on to have a prayer of a chance of NOT getting compromised, Linux is MUCH easier.  I can turn up a Linux server from ground zero to the base install in under an hour WITHOUT USING AN IMAGE.  Updates?  One run and one reboot.  Windows?  It'll be multiples of each... it goes on and on and on.
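To put "one run and one reboot" in concrete terms, the entire patch pass on a Debian-family box is this (run as root):

```
apt-get update        # refresh the package lists from the distro mirrors
apt-get -y upgrade    # apply every pending security and bug-fix update in one pass
# reboot only if a new kernel came down; otherwise the box keeps running
```

Red Hat-family boxes are the same story with yum in place of apt-get.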

Windows access control “blows Linux out of the water,” he claims. “In a Windows box, you can set access-control mechanisms without a software add-on.”

He apparently hasn't heard of chmod and chown.  You can do everything you want right from the CLI.  I tend to use a package called Webmin, which is installed from the command line and run from a web browser.. so I don't have to pay the Windows GUI performance tax.
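For anyone who hasn't met them, those two commands cover most day-to-day access control with no add-ons at all.  A quick runnable sketch (the file name is just an example; chown itself needs root, so it's shown commented out):

```shell
#!/bin/sh
# Built-in Unix access control from the CLI, no add-on software required.
f=/tmp/payroll.txt
touch "$f"
chmod 640 "$f"            # owner read/write, group read, everyone else nothing
# chown alice:accounting "$f"   # set owner and group (root only)
stat -c '%a %n' "$f"      # prints: 640 /tmp/payroll.txt
```

Finer-grained per-user rules are there too via POSIX ACLs (setfacl/getfacl), still with nothing to install on most distros.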

Patching is inevitable with either Windows or Linux, and in this arena, Cox says that it’s easier to patch Windows. Microsoft is the only source to issue Windows patches. With Linux, you have to decide whether to go to an open-source entity for patches, for instance the one for OpenSSH, or wait until a commercial Linux provider, such as Red Hat, provides a patch.

OR you can use a community variant called CentOS (to reference Red Hat), which is non-commercial... OR you can use the granddaddy of Linux distros, Debian, which is the basis of many, many other distributions.  You don't have to go to OpenSSH directly, because the distros are hooked right into the upstream package vendors.  Here's one point the author missed: speed of patches.  Microsoft WON'T patch outside its monthly cycle until there's an active exploit.  Most Linux distros patch within 24 hours of a release.. 24 HOURS.. not DAYS or MONTHS.. HOURS.  Let's see Microsoft do that, and do it reliably without hosing its users' systems that have gotten infested due to their continued bad design choices.

Microsoft presents a monolithic single point of contact for business customers, whereas “In Linux, you need to know where to go for what,” which makes it more complicated, Cox says. “There’s no such thing as a TechNet for Linux,” he says. Linux users need to be enthusiastic participants in the sometimes clannish open-source community to get the optimum results.

Oh, and Microsofties aren't clannish?  LOL!  Let me tell you something.. if you don't drink the Microsoft Kool-Aid totally, you won't be welcome in the MS forums and MS evangelist circles.  Trust me, I know about this.

These kind of arguments may indicate why Windows Server continues to have huge appeal in the enterprise setting, though some vertical industries, such as financial firms, have become big-time Linux users.

The only reason Windows keeps hanging around like a fungus is that the third-party app vendors have not yet started coding for Linux in large numbers... that's coming.  Once folks can see the advantages of Linux, MS will have to tighten up their code or die.

Linux and open-source applications are popular in the Internet-facing extranet of the enterprise, Cox notes. And Linux has become a kind of industrial technology for vendors which use it in a wide range of products and services — for instance Amazon’s EC2 computing environment data centers rely on Xen-based Linux servers.

Know why?  Security is one reason, reliability is another, and patching is stupid easy (run updates on the live system; if there are no kernel updates, no reboot at all).  Windows hangs around right now because third-party vendors aren't coding for Linux... yet.  MS does have its place, and I will recommend Windows on the back end only when it's truly necessary.  The comments on this article do a far better job of eviscerating the author than I do..:)

The use cases really are getting smaller and smaller.  You really need to have 5 or more machines active.  Running e-mail out of house can be done, but it's not easy, as Exchange really wants to be the mail hub (which makes sense, as it IS a full-featured mail server).  The issues are the high cost as well as the high system requirements.  You really need a minimum of 8 gigs of RAM, and you really need true hardware RAID 1 or higher.  I have found dual cores to suffice if they are fast enough, but quads are so cheap there's no reason to skimp.  Unfortunately, this is another example of MS products getting very, very bloated.

I think for my small clients Server 2008 Standard, or even Server 2008 Foundation for simple AD and file sharing, is going to be the best bet.  If you aren't tied to the Microsoft back end (say, folks who run programs that require a Windows server to share databases), then I have a couple of alternatives:

1. eBox

2. ClearOS

Both of these are Linux-based groupware suites, and you can't beat the price: free.  If you aren't tied to a Microsoft back end and are a small shop, there's no longer any need to spend $2-3k on an MS-based server.. you can get a $500 server and use one of these packages.  The only additional cost is an installation fee from ECC... that's it.

If you are tied to a Microsoft back end, then SBS may be a good fit for you.  I have been testing Google Apps for business for my own business and personal domains, and it's worked out well.  With a few add-on plugins you can use Mozilla for calendaring and e-mail.  With a few setting changes you can also share calendars between users.  It's not quite as flexible as Outlook/Exchange... yet.  But Google is constantly putting in new features, which means you don't have to be shackled to the Exchange/Outlook pair anymore.

Now that there are truly some alternatives, it only means good things for my clients, as I can now give them the most effective options for their businesses.  SBS 2003 was a great package at a great price.. SBS 2008 has gotten really, really expensive.  Frankly, those consultants who have hooked themselves exclusively to the MS train are doing their clients a grave disservice, in my opinion.

Terremark vCloud Express: First month «

Ouchies.  That's horrendously expensive.  I'm assuming he's getting great service..:)  There are less expensive options out there... like mine..:)

I am not normally a fan of anything gov't, but this time the VA has developed a system for electronic health records that is well-rounded, stable, highly customizable, has tons of features (with more that can be added), and is recognized as a great package.  It's called VistA (vist-A, not Vista).  I'll be looking into this, as well as the alliance that has formed around it, for my medical practitioner clients.  Full article follows:

Code Red: How software companies could screw up Obama's health care reform.

via Code Red – Phillip Longman.

The central contention of Barack Obama’s vision for health care reform is straightforward: that our health care system today is so wasteful and poorly organized that it is possible to lower costs, expand access, and raise quality all at the same time—and even have money left over at the end to help pay for other major programs, from bank bailouts to high-speed rail.

It might sound implausible, but the math adds up. America spends nearly twice as much per person as other developed countries for health outcomes that are no better. As White House budget director Peter Orszag has repeatedly pointed out, the cost of health care has become so gigantic that pushing down its growth rate by just 1.5 percentage points per year would free up more than $2 trillion over the next decade.

The White House also has a reasonably accurate fix on what drives these excessive costs: the American health care system is rife with overtreatment. Studies by Dartmouth’s Atlas of Health Care project show that as much as thirty cents of every dollar in health care spending goes to drugs and procedures whose efficacy is unproven, and the system contains few incentives for doctors to hew to treatments that have been proven to be effective. The system is also highly fragmented. Three-quarters of Medicare spending goes to patients with five or more chronic conditions who see an annual average of fourteen different physicians, most of whom seldom talk to each other. This fragmentation leads to uncoordinated care, and is one of the reasons why costly and often deadly medical errors occur so frequently.

Almost all experts agree that in order to begin to deal with these problems, the health care industry must step into the twenty-first century and become computerized. Astonishingly, twenty years after the digital revolution, only 1.5 percent of hospitals have integrated IT systems today—and half of those are government hospitals. Digitizing the nation’s medical system would not only improve patient safety through better-coordinated care, but would also allow health professionals to practice more scientifically driven medicine, as researchers acquire the ability to mine data from millions of computerized records about what actually works.

It would seem heartening, then, that the stimulus bill President Obama signed in February contains a whopping $20 billion to help hospitals buy and implement health IT systems. But the devil, as usual, is in the details. As anybody who’s lived through an IT upgrade at the office can attest, it’s difficult in the best of circumstances. If it’s done wrong, buggy and inadequate software can paralyze an institution.

Consider this tale of two hospitals that have made the digital transition. The first is Midland Memorial Hospital, a 371-bed, three-campus community hospital in southern Texas. Just a few years ago, Midland Memorial, like the overwhelming majority of American hospitals, was totally dependent on paper records. Nurses struggled to decipher doctors’ scribbled orders and hunt down patients’ charts, which were shuttled from floor to floor in pneumatic tubes and occasionally disappeared into the ether. The professionals involved in patient care had difficulty keeping up with new clinical guidelines and coordinating treatment. In the normal confusion of day-to-day practice, medical errors were a constant danger.

This all changed in 2007 when Midland completed the installation of a health IT system. For the first time, all the different doctors involved in a patient’s care could work from the same chart, using electronic medical records, which drew data together in one place, ensuring that the information was not lost or garbled. The new system had dramatic effects. For instance, it prompted doctors to follow guidelines for preventing infection when dressing wounds or inserting IVs, which in turn caused infection rates to fall by 88 percent. The number of medical errors and deaths also dropped. David Whiles, director of information services for Midland, reports that the new health IT system was so well designed and easy to use that it took less than two hours for most users to get the hang of it. “Today it’s just part of the culture,” he says. “It would be impossible to remove it.”

Things did not go so smoothly at Children’s Hospital of Pittsburgh, which installed a computerized health system in 2002. Rather than a godsend, the new system turned out to be a disaster, largely because it made it harder for the doctors and nurses to do their jobs in emergency situations. The computer interface, for example, forced doctors to click a mouse ten times to make a simple order. Even when everything worked, a process that once took seconds now took minutes—an enormous difference in an emergency-room environment. The slowdown meant that two doctors were needed to attend to a child in extremis, one to deliver care and the other to work the computer. Nurses also spent less time with patients and more time staring at computer screens. In an emergency, they couldn’t just grab a medication from a nearby dispensary as before—now they had to follow the cumbersome protocols demanded by the computer system. According to a study conducted by the hospital and published in the journal Pediatrics, mortality rates for one vulnerable patient population—those brought by emergency transport from other facilities—more than doubled, from 2.8 percent before the installation to almost 6.6 percent afterward.

Why did similar attempts to bring health care into the twenty-first century lead to triumph at Midland but tragedy at Children’s? While many factors were no doubt at work, among the most crucial was a difference in the software installed by the two institutions. The system that Midland adopted is based on software originally written by doctors for doctors at the Veterans Health Administration, and it is what’s called “open source,” meaning the code can be read and modified by anyone and is freely available in the public domain rather than copyrighted by a corporation. For nearly thirty years, the VA software’s code has been continuously improved by a large and ever-growing community of collaborating, computer-minded health care professionals, at first within the VA and later at medical institutions around the world. Because the program is open source, many minds over the years have had the chance to spot bugs and make improvements. By the time Midland installed it, the core software had been road-tested at hundreds of different hospitals, clinics, and nursing homes by hundreds of thousands of health care professionals.

The software Children’s Hospital installed, by contrast, was the product of a private company called Cerner Corporation. It was designed by software engineers using locked, proprietary code that medical professionals were barred from seeing, let alone modifying. Unless they could persuade the vendor to do the work, they could no more adjust it than a Microsoft Office user can fine-tune Microsoft Word. While a few large institutions have managed to make meaningful use of proprietary programs, these systems have just as often led to gigantic cost overruns and sometimes life-threatening failures. Among the most notorious examples is Cedars-Sinai Medical Center, in Los Angeles, which in 2003 tore out a “state-of-the-art” $34 million proprietary system after doctors rebelled and refused to use it. And because proprietary systems aren’t necessarily able to work with similar systems designed by other companies, the software has also slowed what should be one of the great benefits of digitized medicine: the development of a truly integrated digital infrastructure allowing doctors to coordinate patient care across institutions and supply researchers with vast pools of data, which they could use to study outcomes and develop better protocols.

Unfortunately, the way things are headed, our nation’s health care system will look a lot more like Children’s and Cedars-Sinai than Midland. In the haste of Obama’s first 100 days, the administration and Congress crafted the stimulus bill in a way that disadvantages open-source vendors, who are upstarts in the commercial market. At the same time, it favors the larger, more established proprietary vendors, who lobbied to get the $20 billion in the bill. As a result, the government’s investment in health IT is unlikely to deliver the quality and cost benefits the Obama administration hopes for, and is quite likely to infuriate the medical community. Frustrated doctors will give their patients an earful about how the crashing taxpayer-financed software they are forced to use wastes money, causes two-hour waits for eight-minute appointments, and constrains treatment options.

Done right, digitized health care could help save the nation from insolvency while improving and extending millions of lives at the same time. Done wrong, it could reconfirm Americans’ deepest suspicions of government and set back the cause of health care reform for yet another generation.
Open-source software has no universally recognized definition. But in general, the term means that the code is not secret, can be utilized or modified by anyone, and is usually developed collaboratively by the software’s users, not unlike the way Wikipedia entries are written and continuously edited by readers. Once the province of geeky software aficionados, open-source software is quickly becoming mainstream. Windows has an increasingly popular open-source competitor in the Linux operating system. A free program called Apache now dominates the market for Internet servers. The trend is so powerful that IBM has abandoned its proprietary software business model entirely, and now gives its programs away for free while offering support, maintenance, and customization of open-source programs, increasingly including many with health care applications. Apple now shares enough of its code that we see an explosion of homemade “applets” for the iPhone—each of which makes the iPhone more useful to more people, increasing Apple’s base of potential customers.

If this is the future of computing as a whole, why should U.S. health IT be an exception? Indeed, given the scientific and ethical complexities of medicine, it is hard to think of any other realm where a commitment to transparency and collaboration in information technology is more appropriate. And, in fact, the largest and most successful example of digital medicine is an open-source program called VistA, the one Midland chose.

VistA was born in the 1970s out of an underground movement within the Veterans Health Administration known as the “Hard Hats.” The group was made up of VA doctors, nurses, and administrators around the country who had become frustrated with the combination of heavy caseloads and poor record keeping at the institution. Some of them figured that then-new personal and mini computers could be the solution. The VA doctors pioneered the nation’s first functioning electronic medical record system, and began collaborating with computer programmers to develop other health IT applications, such as systems that gave doctors online advice in making diagnoses and settling on treatments.

The key advantages of this collaborative approach were both technical and personal. For one, it allowed medical professionals to innovate and learn from each other in tailoring programs to meet their own needs. And by involving medical professionals in the development and application of information technology, it achieved widespread buy-in of digitized medicine at the VA, which has often proven to be a big problem when proprietary systems are imposed on doctors elsewhere.

This open approach allowed almost anyone with a good idea at the VA to innovate. In 1992, Sue Kinnick, a nurse at the Topeka, Kansas, VA hospital, was returning a rental car and saw the use of a bar-code scanner for the first time. An agent used a wand to scan her car and her rental agreement, and then quickly sent her on her way. A light went off in Kinnick’s head. “If they can do this with cars, we can do this with medicine,” she later told an interviewer. With the help of other tech-savvy VA employees, Kinnick wrote software, using the Hard Hats’ public domain code, that put the new scanner technology to a new and vital use: preventing errors in dispensing medicine. Under Kinnick’s direction, patients and nurses were each given bar-coded wristbands, and all medications were bar-coded as well. Then nurses were given wands, which they used to scan themselves, the patient, and the medication bottle before dispensing drugs. This helped prevent four of the most common dispensing errors: wrong med, wrong dose, wrong time, and wrong patient. The system, which has been adopted by all veterans hospitals and clinics and continuously improved by users, has cut the number of dispensing errors in half at some facilities and saved thousands of lives.

At first, the efforts of enterprising open-source innovators like Kinnick brought specific benefits to the VA system, such as fewer medical errors and reduced patient wait times through better scheduling. It also allowed doctors to see more patients, since they were spending less time chasing down paper records. But eventually, the open-source technology changed the way VA doctors practiced medicine in bigger ways. By mining the VA’s huge resource of digitized medical records, researchers could look back at which drugs, devices, and procedures were working and which were not. This was a huge leap forward in a profession where there is still a stunning lack of research data about the effectiveness of even the most common medical procedures. Using VistA to examine 12,000 medical records, VA researchers were able to see how diabetics were treated by different VA doctors, and by different VA hospitals and clinics, and how they fared under the different circumstances. Those findings could in turn be communicated back to doctors in clinical guidelines delivered by the VistA system. In the 1990s, the VA began using the same information technology to see which surgical teams or hospital managers were underperforming, and which deserved rewards for exceeding benchmarks of quality and safety.

Thanks to all this effective use of information technology, the VA emerged in this decade as the bright star of the American health system in the eyes of most health-quality experts. True, one still reads stories in the papers about breakdowns in care at some VA hospitals. That is evidence that the VA is far from perfect—but also that its information system is good at spotting problems. Whatever its weaknesses, the VA has been shown in study after study to be providing the highest-quality medical care in America by such metrics as patient safety, patient satisfaction, and the observance of proven clinical protocols, even while reducing the cost per patient.

Following the organization’s success, a growing number of other government-run hospitals and clinics have started adapting VistA to their own uses. This includes public hospitals in Hawaii and West Virginia, as well as all the hospitals run by the Indian Health Service. The VA’s evolving code also has been adapted by providers in many other countries, including Germany, Finland, Malaysia, Brazil, India, and, most recently, Jordan. To date, more than eighty-five countries have sent delegations to study how the VA uses the program, with four to five more coming every week.
Proprietary systems, by contrast, have gotten a cool reception. Although health IT companies have been trying to convince hospitals and clinics to buy their integrated patient-record software for more than fifteen years, only a tiny fraction have installed such systems. Part of the problem is our screwed-up insurance reimbursement system, which essentially rewards health care providers for performing more and more expensive procedures rather than improving patients’ welfare. This leaves few institutions that are not government run with much of a business case for investing in health IT; using digitized records to keep patients healthier over the long term doesn’t help the bottom line.

But another big part of the problem is that proprietary systems have earned a bad reputation in the medical community for the simple reason that they often don’t work very well. The programs are written by software developers who are far removed from the realities of practicing medicine. The result is systems which tend to create, rather than prevent, medical errors once they’re in the hands of harried health care professionals. The Joint Commission, which accredits hospitals for safety, recently issued an unprecedented warning that computer technology is now implicated in an incredible 25 percent of all reported medication errors. Perversely, license agreements usually bar users of proprietary health IT systems from reporting dangerous bugs to other health care facilities. In open-source systems, users learn from each other’s mistakes; in proprietary ones, they’re not even allowed to mention them.

If proprietary health IT systems are widely adopted, even more drawbacks will come sharply into focus. The greatest benefits of health IT—and ones the Obama administration is counting on—come from the opportunities that are created when different hospitals and clinics are able to share records and stores of data with each other. Hospitals within the digitized VA system are able to deliver more services for less mostly because their digital records allow doctors and clinics to better coordinate complex treatment regimens. Electronic medical records also produce a large collection of digitized data that can be easily mined by managers and researchers (without their having access to the patients’ identities, which are privacy protected) to discover what drugs, procedures, and devices work and which are ineffective or even dangerous. For example, the first red flags about Vioxx, an arthritis medication that is now known to cause heart attacks, were raised by the VA and large private HMOs, which unearthed the link by mining their electronic records. Similarly, the IT system at the Mayo Clinic (an open-source one, incidentally) allows doctors to personalize care by mining records of specific patient populations. A doctor treating a patient for cancer, for instance, can query the treatment outcomes of hundreds of other patients who had tumors in the same area and were of similar age and family backgrounds, increasing odds that they choose the most effective therapy.

But in order for data mining to work, the data has to offer a complete picture of the care patients have gotten from all the various specialists involved in their treatment over a period of time. Otherwise it’s difficult to identify meaningful patterns or sort out confounding factors. With proprietary systems, the data is locked away in what programmers call “black boxes,” and cannot be shared across hospitals and clinics. (This is partly by design; it’s difficult for doctors to switch IT providers if they can’t extract patient data.) Unless patients get all their care in one facility or system, the result is a patchwork of digital records that are of little or no use to researchers. Significantly, since proprietary systems can’t speak to each other, they also offer few advantages over paper records when it comes to coordinating care across facilities. Patients might as well be schlepping around file folders full of handwritten charts.

Of course, not all proprietary systems are equally bad. A program offered by Epic Systems Corporation of Wisconsin rivals VistA in terms of features and functionality. When it comes to cost, however, open source wins hands down, thanks to no or low licensing costs. According to Dr. Scott Shreeve, who is involved in the VistA installations in West Virginia and elsewhere, installing a proprietary system like Epic costs ten times as much as VistA and takes at least three times as long—and that’s if everything goes smoothly, which is often not the case. In 2004, Sutter Health committed $154 million to implementing electronic medical records in all the twenty-seven hospitals it operated in Northern California using Epic software. The project was supposed to be finished by 2006, but things didn’t work out as planned. Sutter pulled the plug on the project in May of this year, having completed only one installation and facing remaining cost estimates of $1 billion for finishing the project. In a letter to employees, Sutter executives explained that they could no longer afford to fund employee pensions and also continue with the Epic buildout.
Unfortunately, billions of taxpayers’ dollars are about to be poured into expensive, inadequate proprietary software, thanks to a provision in the stimulus package. The bill offers medical facilities as much as $64,000 per physician if they make “meaningful use” of “certified” health IT in the next year and a half, and punishes them with cuts to their Medicare reimbursements if they don’t do so by 2015. Obviously, doctors and health administrators are under pressure to act soon. But what is the meaning of “meaningful use”? And who determines which products qualify? These questions are currently the subject of bitter political wrangling.

Vendors of proprietary health IT have a powerful lobby, headed by the Healthcare Information and Management Systems Society, a group with deep ties to the Obama administration. (The chairman of HIMSS, Blackford Middleton, is an adviser to Obama’s health care team and was instrumental in getting money for health IT into the stimulus bill.) The group is not openly against open source, but last year when Rep. Pete Stark of California introduced a bill to create a low-cost, open-source health IT system for all medical providers through the Department of Health and Human Services, HIMSS used its influence to smash the legislation. The group is now deploying its lobbying clout to persuade regulators to define “meaningful use” so that only software approved by an allied group, the Certification Commission for Healthcare Information Technology, qualifies. Not only are CCHIT’s standards notoriously lax, the group is also largely funded and staffed by the very industry whose products it is supposed to certify. Giving it the authority over the field of health IT is like letting a group controlled by Big Pharma determine which drugs are safe for the market.

Even if the proprietary health IT lobby loses the battle to make CCHIT the official standard, the promise of open-source health IT is still in jeopardy. One big reason is the far greater marketing power that the big, established proprietary vendors can bring to bear compared to their open-source counterparts, who are smaller and newer on the scene. A group of proprietary industry heavyweights, including Microsoft, Intel, Cisco, and Allscripts, is sponsoring the Electronic Health Record Stimulus Tour, which sends teams of traveling sales representatives to tell local doctors how they can receive tens of thousands of dollars in stimulus money by buying their products—provided that they “act now.” For those medical professionals who can’t make the show personally, helpful webcasts are available. The tour is a variation on a tried-and-true strategy: when physicians are presented with samples of pricey new name-brand substitutes for equally good generic drugs, time and again they start prescribing the more expensive medicine. And they are likely to be even more suggestible when it comes to software because most don’t know enough about computing to evaluate vendors’ claims skeptically.

What can be done to counter this marketing offensive and keep proprietary companies from locking up the health care IT market? The best and simplest answer is to take the stimulus money off the table, at least for the time being. Rather than shoveling $20 billion into software that doesn’t deliver on the promise of digital medicine, the government should put a hold on that money pending the results of a federal interagency study that will be looking into the potential of open-source health IT and will deliver its findings by October 2010.

As it happens, that study is also part of the stimulus bill. The language for it was inserted by West Virginia Senator Jay Rockefeller, who has also introduced legislation that would help put open-source health IT on equal footing with the likes of Allscripts and Microsoft. Building on the systems developed by the VA and Indian Health Services, Rockefeller’s bill would create an open-source government-sponsored “public utility” that would distribute VistA-like software, along with grants to pay for installation and maintenance. The agency would also be charged with developing quality standards for open-source health IT and guidelines for interoperability. This would give us the low-cost, high-quality, fully integrated and proven health IT infrastructure we need in order to have any hope of getting truly better health care.

Delaying the spending of that $20 billion would undoubtedly infuriate makers of proprietary health software. But it would be welcomed by health care providers who have long resisted—partly for good reason—buying that industry’s product. Pushing them to do so quickly via the stimulus bill amounts to a giant taxpayer bailout of health IT companies whose business model has never really worked. That wouldn’t just be a horrendous waste of public funds; it would also lock the health care industry into software that doesn’t do the job and would be even more expensive to get rid of later.

As the administration and Congress struggle to pass a health care reform bill, questions about which software is best may seem relatively unimportant—the kind of thing you let the “tech guys” figure out. But the truth is that this bit of fine print will determine the success or failure of the whole health care reform enterprise. So it’s worth taking the time to get the details right.

It is going to be either Astaro or Untangle, depending on what the client’s needs are.  IPFire feels kludgey, and IPCop isn’t really designed for modern hardware and is a bit too basic for my needs.  ComixWall is a typical Debian distro..if you want to live in the CLI, go this route..:)

According to the Centos homepage:

The CentOS Development team had a routine meeting today with Lance Davis in attendance. During the meeting a majority of issues were resolved immediately and a working agreement was reached with deadlines for remaining unresolved issues. There should be no impact to any CentOS users going forward.

The CentOS project is now in control of the centos.org and centos.info domains and owns all trademarks, materials, and artwork in the CentOS distributions.

We look forward to working with Lance to quickly complete all the agreed upon issues.

More information will follow soon.

Original Letter

Last Update: August 1, 2009 04:34 UTC by Donavan Nelson

This is nice, but it does not answer the questions they raised in public.  It also does not tell me they are going to be more transparent, which could lead to another similar issue in the future.  CentOS, what has actually been done to resolve these issues?

Right now I still cannot put my full trust in the long term stability or viability of this project.

For the clients running CentOS, I am going to be researching other alternatives.  Right now no servers are in danger of being unable to update.  I will keep everyone informed as this situation unfolds.

Read this site:

Planet CentOS.

It turns out the CentOS project is under the control of one person, and that person decided to disappear..for over a year.  Donated monies did not go to CentOS but to the founding individual.  This type of thing can happen anywhere, but it is exactly what gives anti-open-source folks tons of ammunition.  They may have to rename the project or merge with another one.  I will be watching developments as they unfold.  I personally am now researching other distros to migrate to, since I can no longer be assured of the stability or longevity of CentOS.


I have been following the mailing list as well; you can find the mailing-list entries here.

*UPDATE 2*

The following was posted on the sidebar of the CentOS homepage:

  • CentOS is not dead or going away. The signers of the Open Letter are fully committed to continue the CentOS Project. Updates and new releases will continue.
  • The issues raised in the Open Letter have been raised privately literally for years and a voluntary resolution had been hoped for and worked toward. But progress requires follow through. We have tried contacting Lance in private for a long period of time before this Open Letter. While we received promises, there was no real response or follow through from him on promises made. We are sure he is not dead, on vacation, or sick. Once we all decided there was no movement in the matter we created the Open Letter. This is not something that appeared just recently.
  • We would really like to continue the project using the centos.org domain. That is one of the reasons for the Open Letter. But the developers will move to another domain if there is no other option. Protective backups are in place; hot machines exist to allow for a cutover with a simple one time installation of one RPM package. We continue to refine our plans if this might be the case, to make the transition as smooth as possible.
  • We thank the people who have stepped forward and want to donate to the CentOS project to hold off for now until issues surrounding the domain and donation policy are resolved. Selected donations will be privately solicited by the signers of the Open Letter on some transition matters. We will post general instructions on how you can help the project as matters become resolved.
  • The CentOS project is run completely by volunteers and we are aware that this requires a different management style. We have been and continue to work to prevent issues like these from occurring in the future. We will continue this effort in the future, but the matters mentioned in the Open Letter prevent us from moving forward at this moment, as they need to be resolved first.
Security issues with sudo « Mihai’s Weblog.

I have always thought Ubuntu’s way of locking out direct root access was nonsensical.  It now turns out it’s worse than nonsensical; it’s Microsoft-ish.
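The core of the complaint, as I read it, is sudo’s cached-credential window: once you authenticate, anything running as your user can piggyback on that cached timestamp for several minutes. A minimal hardening sketch of the sudoers knobs that close that window (the file path here is illustrative and written to /tmp so nothing real is touched; an actual change belongs in visudo):

```shell
# Sketch of sudoers hardening options; written to a scratch file for
# illustration rather than /etc/sudoers.
cat <<'EOF' > /tmp/sudoers-hardening
# Each terminal must authenticate separately, so a credential cached
# on one tty cannot be reused by a process on another.
Defaults tty_tickets
# Disable the credential cache entirely: prompt for a password on
# every sudo invocation.
Defaults timestamp_timeout=0
EOF

cat /tmp/sudoers-hardening
```

In practice you would add those `Defaults` lines via `sudo visudo`, which syntax-checks the file before saving; a typo in the real /etc/sudoers can lock you out of sudo altogether.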

There are so many out there it’s hard to sift through.  I have narrowed things down to 4:

1. IPFire

2. Astaro

3. Untangle

4. ComixWall

Depending on capabilities and budget, those are the 4 ECC is going to be choosing from for the foreseeable future.  This makes it easier for my clients to know what I am offering, and easier for me since I’m not supporting 10 different server packages.  I intend on narrowing it down to 2 in the near future.

ComixWall ISG – Home.

Now this is interesting.  ComixWall looks like what I have been looking for.  I have been wanting to find a UTM that takes what I had with ipcop/zerina/cop+/copfilter and have it integrated into one package…and be free.  Astaro is very nice, but it’s free only for home use.  Untangle is nice, but its feature set is compromised by their wanting to compete with Astaro, so the free content filter and free anti-spam are limited in their effectiveness unless you go commercial.  ComixWall is totally free and so far looks promising..more to come.

linux_timeline.png (PNG Image, 2888×2079 pixels).

I found this interesting…enjoy.

Startup Founders Turn Android into Desktop OS – PC World.

Now this is interesting.  Read on for full details.

BS’ Blog:

Bryan J Smith delves into the MS file systems, compares them against Linux/Unix file systems, and digs out why MS needs defragmenting but Linux does not.

*UPDATE* Bryan has updated the post with more XFS details and clarifications.