elvis' big fat Free Software / Linux 101 sticky thread

Discussion in 'Other Operating Systems' started by elvis, Dec 8, 2008.

  1. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    30,823
    Location:
    Brisbane
    I originally wrote the following articles on another forum. I'm going to stick them all in here in one big thread.

    Some of the information is old now (it was written about 18 months ago), and since writing a lot of things have changed in the Free Software world. Not only in package updates, but also in new stuff appearing. All the same, much of the following is still valid, and at least will give total Linux newbies a heads up at what's out there.

    The posts start with a general "What is free software" Linux 101 style philosophical rambling, but towards the end will list a bunch of applications that people can try, broken down by type (office, audio, graphics, etc).

    The posts may seem a little "broken", as a lot of them are responses to questions which aren't posted here. Use your imagination to fill in the blanks. :)

    Enjoy...
     
  2. elvis (OP)
    In it I want to do the following:

    1) Explain what free software and Linux is, and why they may or may not matter to you

    2) Debunk a lot of myths about GNU/Linux, as well as remove the rose coloured glasses and give folks a realistic expectation of what it can and can't do

    3) Give an ever-growing list of end-user software for GNU/Linux that is either identical or functionally equivalent to a Windows/non-free alternative. After all, if an operating system can't run applications that you need it to, it's useless to you.

    Firstly some definitions and FAQs:

    What is Linux?
    "Linux" is a kernel - a small piece of software that talks directly to computer hardware. "Linux" by itself is useless to an end user. It needs some sort of interface to show the user output (graphics, icons, sounds, etc) and some sort of shell/input to recieve instructions (again, icons, mouse/keyboard, etc). Linux is most often paired with the GNU utilities and software such as the graphical desktops GNOME and KDE. The combined name for this entire operating system is GNU/Linux.

    What is free software?

    Some people would say "software that costs nothing". And surprisingly, they'd be wrong. "Free" means two things in the English language.

    From dictionary.com: http://dictionary.reference.com/browse/free

    Summary: free can mean:

    1) Not costing any money

    2) Liberated, or removed from restriction.

    "Free" in "Free Software" means the second. "Free Software" is often sold for money, and indeed in many countries people make a great deal of money selling Free Software to all markets including home users, education, government and the corporate world.

    Why should I care if it's free?

    Good question. Let's first examine the "four freedoms" that Richard Stallman (founding member of the Free Software Foundation) believes every user should have when it comes to software (excuse the nerdiness as he starts counting from "0" like a computer):

    * The freedom to run the program, for any purpose (freedom 0).

    * The freedom to study how the program works, and adapt it to your needs (freedom 1). Access to the source code is a precondition for this.

    * The freedom to redistribute copies so you can help your neighbor (freedom 2).

    * The freedom to improve the program, and release your improvements to the public, so that the whole community benefits (freedom 3). Access to the source code is a precondition for this.

    These freedoms sound pretty common sense. They are freedoms you would demand from anything. But all are in jeopardy. Let me give examples:

    * The freedom to run the program, for any purpose (freedom 0).

    DRM. It's here, and it's bad. So you've bought a movie, and you want to watch it on hardware other than your DVD player or home PC? If you live in the United States or Canada (and soon Australia), that is a criminal offense. Yes, an offense for running software outside its intended use as outlined by a foreign company.

    * The freedom to study how the program works, and adapt it to your needs (freedom 1). Access to the source code is a precondition for this.

    You've downloaded a piece of software to play a movie, and you want to look at the source code to make sure it doesn't "phone home" and distribute information about you, or install spyware on your computer. Perfectly valid reasoning, but illegal under the DMCA (Digital Millennium Copyright Act).

    * The freedom to redistribute copies so you can help your neighbor (freedom 2).

    School children learn complex computer programs at school. They go home, and find that they can't afford $1000 for the latest Microsoft Office. They copy a friend's CD so that they can learn the package, with obviously no commercial gain. They have broken the law.

    * The freedom to improve the program, and release your improvements to the public, so that the whole community benefits (freedom 3). Access to the source code is a precondition for this.

    Crackers have found a new Microsoft Windows vulnerability! It's spreading like wildfire and costing businesses and individuals millions of dollars. You find a patch, and want to upload it to Microsoft so you can help your fellow man. Under the Windows EULA (End User License Agreement) this is not allowed. Under US copyright law, this is illegal, and criminal.

    Four examples, spanning a wide range of user types, where your freedoms are removed when using non-free software.

    Is GNU/Linux the only free software?
    No, not even close. Free software comes in many shapes and forms. GNU/Linux is just one of many. I concentrate on GNU/Linux because I find it the most flexible for a range of tasks I need to do for both myself and customers, but it is by no means the only, nor the "best" free software.

    I use heaps of freeware already, why is this different?
    "Freeware" is a term that refers to software which is free of cost. This software often is still closed and proprietary, and does not allow you access to change, fix or improve it. Despite the small syntactic difference in names, "freeware" is not "free software".
     
  3. elvis (OP)
    Now some quick myth debunking.

    Linux is just home hobbyist tinker software and homebrew.

    GNU/Linux is free as in freedom. Anyone is allowed access to the source code. While there are a substantial number of "home hobbyists" working on GNU/Linux software, the number of large corporate and commercial programmers working on GNU/Linux far outweighs them.

    Anyone who has worked in, with, on or even near a computer in the last 20 years will recognise names like IBM, Hewlett-Packard and Novell. All of these companies have heavy IP (Intellectual Property) investments in GNU/Linux, and between them have contributed millions of lines of very high quality code. All three companies are responsible for GNU/Linux servers and devices that run the world's banks, hospitals and life support systems, finance and stock market servers, all the way down to office file servers, movie studio render farms, and even average end-user desktops.

    You can't make money from free software!
    Yes you can. I do, for a start. :)

    As mentioned, you can sell free software. That is your freedom. You can sell it for any price you like, to anyone you like, whether you programmed it or not. That is your freedom. Of course, someone else is free to do the same at a lower price than you. :)

    As already mentioned, IBM, HP and Novell all sell free software. They sell it for big bikkies too (try to get a quote from IBM for some Linux servers, but make sure you're sitting down first). Some of these companies sell hardware and software bundled together. Some sell just software. The world's most famous all-Linux company is undoubtedly Red Hat, arguably the world's first corporate-targeting all-GNU/Linux company:
    http://www.redhat.com/

    Other companies have since surfaced, including SuSE (now owned by Novell), Mandrake/Mandriva, and a small South African mob called Canonical who are picking up speed with their very user-friendly Ubuntu distribution.

    These companies sell not only software, but their service to go with.

    Why the hell would you buy something that's free?
    Again, the "free" means "freedom", not "no cost".

    But GNU/Linux distributions are usually free of cost to download. Some will even send you a pressed CD at no cost, shipping included. So why would you buy it? Good question.

    As mentioned, service is the big answer there. Big companies fear being left in the cold. Nobody wants to buy some software and have no-one to train them on how to use it, or fix it when it goes wrong.

    Now, let's go back to three of our four freedoms:
    * The freedom to run the program, for any purpose (freedom 0).
    * The freedom to study how the program works, and adapt it to your needs (freedom 1). Access to the source code is a precondition for this.
    * The freedom to improve the program, and release your improvements to the public, so that the whole community benefits (freedom 3). Access to the source code is a precondition for this.

    Free Software gives businesses the freedom to do all of this. Furthermore, one of the big perils of business is "vendor lock-in". There's nothing worse than signing a 5-year support contract with a company, and then having them treat you like dirt. I've been through it dozens of times: a company buys some non-free software with a support contract; the support people play nice for 12 months, then suddenly turn nasty and start charging for every phone call and every email, making the cost of support 10 times what was first advertised. In the process, the company who bought the software is stuck: they can't improve the software themselves, and they can't hire someone else to improve it either.

    Free Software gives businesses the freedom to change support personnel. Having the source code ensures that new developers can be called in, and they can fix and adapt the software to the company's needs.

    Look, that's great, but I'm a home user - I don't care about support or the corporate world
    Valid point. But freedom is more than support. With free software you can set up software any way you like, with no need to worry about corporate-style licensing. Want a mail server at home? Sure, you're free to do that without paying for Microsoft Exchange. Want to give your grandma a copy? Sure, you're free to do that without buying another license. Want to have a go at writing a website or program yourself? Sure, you're free to set up a web server or write and compile code without buying expensive corporate-focussed software that is way beyond what a home user needs.

    What's a "distro"?
    GNU/Linux is free - free to modify, free to improve, free to redistribute. As such, many companies and individuals roll their own distributions (or "distros") which are just collections of pre-packaged free software. The choice of distro is largely irrelevant - they all have pros and cons and no single distro is "the best". Users are free to pick the one that suits their need.

    Some distros focus on different things: some are designed for desktops, some for servers, some for personal video recorders, some for embedded devices, and some just to be really nerdy and hard to use.

    What's with all these "distros"? Why don't they just make one Linux and be done with it?
    GNU/Linux is about freedom. Freedom of choice is a MASSIVE freedom, and one that everyone deserves.

    The question is: whose distro is the best? Whose distro is the most correct? Well, I know which one *I* like the best, but who's to say that my preference would meet the needs of everyone else?

    Switching to GNU/Linux can be a daunting task for new users, there is no doubt. When I first started using Linux there were a tenth the number of distros there are today.

    My advice to new-to-Linux users is this: pick a distro with good *COMMUNITY* support. Go to the distro author's website, and look to see if they have forums or mailing lists. Even people like me with 10+ years of Linux experience run into hurdles occasionally. Having a forum with thousands of users all in the same boat helps. Again, going back to the four freedoms: an important freedom is the ability to help your neighbour - one that resonates loudly through the Linux community.

    I hate using Linux - what are my other options?
    As mentioned, Linux is not the only free software out there. BSD is a popular alternative to Linux, and comes in many flavours. One popular alternative is Apple's MacOSX. While the GUI frontend desktop is not free software, the underlying architecture and kernel are. Plenty of GNU free software makes Macs run today. If you're a Mac user reading this in Safari, you are actually using Apple's modified version of KHTML, the rendering engine from the free Konqueror web browser!

    And if you still want to stick with Windows, there are TONNES of free software packages making their way to Microsoft's baby. I would hazard a guess that a lot of you are reading this in the Mozilla Firefox web browser. Or perhaps you read your email and RSS feeds in Mozilla Thunderbird. Perhaps you write documents in Open Office. Even this very forum you are using now is built on free software (and another example of free software that somebody paid for, because it was worth the cash).

    Even after all is said and done, if people can't give up Windows, it doesn't mean they are restricted from using free software. Again, free software is about freedom - even if you don't want to run a free OS, you can still run free software. That is your freedom.

    That will do for now. There's a lot of information there to digest. Later, I'll get right into the meat of it and start listing some useful applications that people need in their day to day lives. As mentioned: an OS without Apps is useless. So what do we Linux users actually do on our desktops? Find out next time...
     
    Last edited: Mar 22, 2011
  4. elvis (OP)
    *user question about Cedega*

    This is another great example of "Free Software" that costs money.

    Cedega, by TransGaming:
    http://www.transgaming.com

    How do I get Cedega?

    If you want to use Cedega as a point-and-click, simple install-and-play option, you must buy it and pay a subscription/maintenance fee. This is a good idea for most people as (a) it's really cheap and (b) it gives you access to trained staff who are on hand to offer help over email.

    Cedega is open source software, and the source code can be downloaded and compiled without paying the subscription fee. If you know what you are doing, or are willing to try the product without the pay-for support, this is fine. However if you run into problems, you're on your own.

    You can extract the Cedega source from a system called CVS (Concurrent Versions System). For the non-programmers, CVS is a "database" style software repository that is popular with projects where a lot of people work on the same code. Anyone who's worked on a simple network knows the dangers of multiple users opening and editing documents at the same time. When documents get saved back to their server, whoever saves last overwrites everyone else's changes. Not good. CVS is a "smart" system that tracks the differences between various uploaded source code files (even if the same file has been edited in multiple places by more than one person). It then lets you rebuild a complete source code set by extracting these "diffs" and building a source tree on your local machine.
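    The diff-and-rebuild idea behind CVS can be sketched in a few lines of Python using the standard library's difflib. This is a sketch of the concept only - CVS itself is written in C and stores its history in RCS files - and the file contents here are made up for illustration:

```python
import difflib

# Two revisions of the same source file, as a version control system
# might see them (contents invented for illustration).
original = [
    "int main(void) {\n",
    "    return 0;\n",
    "}\n",
]
edited = [
    "int main(void) {\n",
    "    printf(\"hello\\n\");\n",
    "    return 0;\n",
    "}\n",
]

# Rather than storing the whole edited file again, store only the
# "diff": the lines that changed between the two revisions.
diff = list(difflib.unified_diff(original, edited, "rev1", "rev2"))
print("".join(diff))

# The full edited file can be rebuilt later by replaying the stored
# changes against the original - which is essentially how a checkout
# reconstructs the latest source tree from a chain of diffs.
rebuilt = list(difflib.restore(difflib.ndiff(original, edited), 2))
assert rebuilt == edited
```

    The space saving is the point: for a large file, a one-line change is stored as a one-line diff, not a second full copy.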

    What does Cedega do?

    Computer games of old were written for "bare-metal" hardware. Programmers wrote directly to CPU and RAM to manipulate data at a hardware level. As CPUs became more complex and games got larger, this method of programming became inefficient. On top of that, more and more different CPUs came out every year, making it hard for programmers to keep up. As such, games came to be written for high-level APIs (Application Programming Interfaces). A layered approach was taken where a programmer could program for an API, and the API took care of talking to the hardware.

    On Microsoft Windows, common game-focussed APIs include OpenGL and Direct3D. OpenGL (as the name suggests) is open (free software), but Direct3D isn't. A typical DirectX API "stack" in Windows looks like this:

    You (the gamer)
    The game software
    Input/Output Devices (screen/speakers/keyboard/mouse/joystick)
    Microsoft DirectInput, DirectSound, DirectDraw
    Microsoft Direct3D
    Microsoft HAL (Hardware Abstraction Layer)
    Microsoft Windows
    Windows kernel + Your video card driver
    Hardware (video card, CPU, RAM, etc).

    Information travels up and down this stack as you play the game. The beauty here of course is that programmers only ever need to write games for DirectX. Likewise, you can easily change your CPU, RAM or Video Card and as long as you have the right driver, don't need to change the game. Some of you reading this now will remember a time when games were written only for particular graphics or sound cards! That certainly was an expensive time to be a PC gamer. :)

    So what does Cedega do? Well, DirectX is closed and proprietary. This makes it a problem for Linux gamers. How do they play their favourite games under their favourite OS without Windows?

    Cedega is a software layer that emulates Windows and DirectX system calls, much like MAME or any other emulator recreates, purely in software, hardware and system calls that don't physically exist on the host machine.

    There are many other projects out there that do the same. WINE (amusingly a recursive acronym for "WINE Is Not an Emulator") is a truly free product, and one that will easily handle most 2D Windows applications (I use it to run ClrMAMEPro - the Windows-only MAME ROM manager - under Linux):
    http://www.winehq.com/

    Another is CrossOver, which was actually first written by a NASA employee in his spare time: NASA had moved their operations to Linux, but at the time there was no viable office package (of course now there are many), and he wanted to use MS Office:
    http://www.codeweavers.com/products/

    (Note that CrossOver uses a lot of WINE code - this is the beauty of free software: developers can share and expand on each other's code, and even make a living from it if it's good enough.)

    Cedega is slightly different. It is one of the first products that, while still doing what WINE and CrossOver do, concentrates heavily on 3D graphics, and in particular games. Let's look at our software stack above, but now with Cedega in the picture:

    You (the gamer)
    The game software
    Input/Output Devices (screen/speakers/keyboard/mouse/joystick)
    Microsoft DirectInput, DirectSound, DirectDraw
    Cedega DirectX <-> SDL converter
    Microsoft Direct3D
    Cedega Direct 3D <-> OpenGL converter
    Linux HAL/SDL/OpenGL
    X Window System (graphical interface for Linux and UNIX)
    Linux kernel + Your video card driver
    Hardware (video card, CPU, RAM, etc).

    So, a few things have changed. Cedega, with the aid of software technologies like SDL (Simple DirectMedia Layer) and OpenGL (Open Graphics Library), takes DirectX calls and turns them into calls to other 3D graphics, sound, input and screen drawing libraries on the fly, as the game plays.
    http://www.libsdl.org/
    http://www.opengl.org/
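    Conceptually, that on-the-fly conversion is just a translation layer: intercept calls made against one API and re-issue them against another. Here is a toy sketch of the idea in Python. Every function name below is invented for illustration - the real Direct3D and OpenGL entry points are vastly more complex than this:

```python
# Stand-ins for the native OpenGL-style calls the host system provides.
def gl_clear_colour(r, g, b):
    return f"glClearColor({r}, {g}, {b})"

def gl_draw_triangles(vertices):
    return f"glDrawArrays(GL_TRIANGLES, {len(vertices)} vertices)"

# The translation layer: each "DirectX-style" call the game makes is
# converted, on the fly, into an equivalent call in the free API.
# (Note even the arguments need translating - e.g. 0-255 colour
# channels become 0.0-1.0 floats.)
def d3d_clear(colour):
    r, g, b = colour
    return gl_clear_colour(r / 255, g / 255, b / 255)

def d3d_draw_primitives(vertices):
    return gl_draw_triangles(vertices)

# The game believes it is talking to Direct3D; in reality every call
# is being forwarded to OpenGL underneath.
print(d3d_clear((0, 0, 255)))                    # → glClearColor(0.0, 0.0, 1.0)
print(d3d_draw_primitives([(0, 0), (1, 0), (0, 1)]))
```

    The hard part in the real product is that the two APIs don't map one-to-one, which is why compatibility varies from game to game.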

    Which games does Cedega work with?

    TransGaming maintain a compatibility list here:
    http://transgaming.org/gamesdb/

    There are quite literally THOUSANDS of games that work 100% with Cedega, and hundreds more that partially work or are in development. TransGaming aren't leaving DirectX 9 high-level shader languages alone either: they have solutions in place for people with DirectX 9 and OpenGL 2 cards, so that all the DX9 bling and eye candy won't be lost under Linux gaming.

    Browse the database and see if the game you want is listed. Often there will be screenshots of other users actually playing the game natively under Linux.

    So, that's TransGaming Cedega in a nutshell. Gaming is probably one of the biggest hurdles for Windows-to-Linux switchers. If you are a hardcore tournament-playing PC gamer, Linux may not be suitable as your only system. But don't forget that you can dual-boot! For a long time I used "GNU/Linux for work, Windows for play" as my mantra. Although in recent years my PC gaming has dropped to zero, and so my Windows partition was completely wiped around 4 years ago to make space for work stuff under Linux.

    For casual gamers, TransGaming offers a wonderful solution to play games under GNU/Linux. If you have experience compiling software from CVS, try the download version out. There are plenty of guides on the net on how to download, compile and run it. If you like it and use it regularly, I urge you to buy the subscription. Not only does it support the people who write this code, it also sends a message to game makers that Linux is a viable target platform for games. After all, native Linux gaming is what we GNU/Linux users really want. Even for non-PC-gamers like myself, native GNU/Linux gaming would be the tipping point for many Windows users who are desperate to leave Windows, but see it as the only option for their hobby.

    So that's the gaming topic covered. Later today if I get time I'll get down to the serious business of WORK. (Yes, it's a dirty four letter word, but some of us have to do it sometimes).
     
    Last edited: Jan 18, 2009
  5. elvis (OP)
    Installing software on GNU/Linux

    A note before I start on specific software:

    Most Linux distributions include a "package manager". This is a marvelous piece of software that, as the name suggests, manages the various software packages on your computer.

    Think of it like "Windows Update" for your entire PC, not just Windows - it will update and maintain every single piece of software on your system. Not only that, but every package I'm going to mention is available through the package manager, which means that if you install it this way, it will self-update as new and improved versions come out over the months/years.

    Windows users are very used to downloading .exe files directly off websites and installing/running them. While this is possible to do with Linux executables (note for new users: Linux executables do not use the ".exe" extension), it is not recommended. Instead, I wholly recommend you learn your particular distro's package manager, and how to use it. Not only can you use it to install the 10,000+ different free applications available, but it will keep all installed programs up to date (not just the OS).

    You will occasionally need to venture outside the package manager to install things, but please use this as your last resort. If you use a distro with good support forums, you may find a kind soul or organisation who has set up a "repository" for the software you want, and they have volunteered to maintain it over time. If you can, utilise these as they will give you far less grief over time.

    Ubuntu users, click Applications -> Add/Remove to start the graphical package manager. Search via the provided categories, or use the search tool if you know the exact name of the package. Chapter 3 of the Ubuntu documentation covers this in detail:
    https://help.ubuntu.com/6.10/ubuntu/desktopguide/C/add-applications.html

    For other distro users, seek your distro's documentation to find out where a similar application is.

    I will provide web links for all software I talk about. Please use these for research and information only. If you want to install the software, again I urge you to use your system's package manager, and not download stuff directly from websites for all the reasons I outline above.
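    Part of what makes a package manager so useful is dependency resolution: working out, before anything is installed, which other packages each package needs, and in what order. Here is a toy sketch of that idea in Python. The package names and dependencies below are invented for illustration - real tools like APT also handle versions, conflicts and downloads:

```python
# A made-up dependency table: each package lists the packages it
# needs installed first.
DEPENDS = {
    "gimp": ["gtk", "libpng"],
    "gtk": ["glib"],
    "libpng": [],
    "glib": [],
}

def install_order(package, seen=None):
    """Return packages in the order they must be installed, so that
    every dependency is present before the package that needs it."""
    if seen is None:
        seen = []
    for dep in DEPENDS[package]:
        install_order(dep, seen)   # recurse into dependencies first
    if package not in seen:
        seen.append(package)       # then the package itself, once
    return seen

print(install_order("gimp"))  # → ['glib', 'gtk', 'libpng', 'gimp']
```

    When you ask a package manager for one application and it quietly pulls in a handful of libraries first, this ordering is what it has computed for you.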

    Office Software

    Let's get stuck right into it with office software. Microsoft Office is by far the most common corporate-style office package there is, and a very large reason why users choose Microsoft Windows as their platform of choice. As a user looking to switch to Linux, there's a very good chance you will need to use some sort of office software at some time.

    There's absolutely no shortage of Office software for Linux. Getting stuck straight into the options:

    Open Office
    http://www.openoffice.org


    I'm a big fan of Open Office. It's available for Linux, Windows and MacOSX, so even non-Linux users can benefit from this. I find it a completely comprehensive Office system packed with more features than even Microsoft Office. Many of my Windows-using clients have switched to this and found it to be a great alternative that offers them better workflow and more options than Microsoft Office. Plus upgrades are free, which means no more paying hundreds of dollars per workstation to keep up to date with the latest standards.

    It comes with the following components:

    Open Office Writer:
    http://www.openoffice.org/product/writer.html

    Standard word processor. Has an ENORMOUS range of dictionaries (great if you speak or write multiple languages - I come from a Dutch father and a French mother, so this suits me perfectly). Will read from and write to almost every document standard, including Microsoft Word. It can also export directly to PDF without the need for Acrobat software. By default all OO packages use the OpenDocument international standard, an XML-based filetype (which takes up a LOT less space than MS Office's filetypes, I might add). This default can be changed if the user you are setting it up for can't understand the difference between filetypes, and needs to interact with all-Microsoft users.

    Open Office Calc:
    http://www.openoffice.org/product/calc.html

    Spreadsheet. Like Writer, it will read from and write to MS Excel documents, and export to PDF.

    Open Office Impress:
    http://www.openoffice.org/product/impress.html

    Read/write PowerPoint presentations. Exports to PDF, and to Macromedia Flash SWF files for use on the web. Great for making a presentation, and then putting it on a website afterwards.

    Open Office Math:
    http://www.openoffice.org/product/math.html

    If you've ever tried to document maths in a word processor, you'll know how maddening it is to enter all the special symbols. OO Math is a dedicated maths tool for doing just this. Great for students, engineers, technicians, or anyone who needs to document maths formulae. As usual, PDF export is there.

    Open Office Draw:
    http://www.openoffice.org/product/draw.html

    Easy to use drawing package that allows you to do things like flowcharts, coversheets, and other useful documents. I've lost count of how many network diagrams I've drawn up in this. And you guessed it, PDF export.

    Open Office Base:
    http://www.openoffice.org/product/base.html

    Database forms frontend. Will tie nicely into MySQL and PostgreSQL database engines. As of Open Office 2.0 I believe there is now MS Access compatibility added too, but I have not used it.

    GNOME Office
    http://www.gnome.org/gnome-office/


    IMHO not as fully featured as Open Office, but a much more lightweight alternative for people who either have older/slower machines, or just need a simple set of tools that aren't as heavy as Open Office. Work is being done to port the suite to Windows and native MacOSX, but Mac users can use it now via Fink.

    Unlike Open Office, the following can all be installed separately. No need to download everything if you only want a single part.

    Components include:

    Abi Word:
    http://www.abisource.com/

    Nice quick word processor with all the features you'd expect.

    Gnumeric:
    http://www.gnome.org/projects/gnumeric/

    The original spreadsheet tool for Linux. This has been around a long time, and again is a nice light alternative to Open Office.

    GNOME-DB:
    http://www.gnome-db.org/

    Database frontend to plug into MySQL. I've not used it, so I can't comment.

    KOffice
    http://www.koffice.org/


    The KDE team are never outdone. They too have a comprehensive office suite. Like GNOME Office, there's no compulsion to install all components. Install only what you need if you want to keep a lean system:

    KWord:
    http://www.koffice.org/kword/
    KSpread:
    http://www.koffice.org/kspread/
    KPresenter:
    http://www.koffice.org/kpresenter/
    Kexi:
    http://www.koffice.org/kexi/

    Aka Microsoft Word, Excel, PowerPoint and Access. You know them well.

    Kivio:
    http://www.koffice.org/kivio/

    Line/flowchart drawing.

    Karbon:
    http://www.koffice.org/karbon/

    Vector art tool. Similar to Adobe Illustrator, Corel Draw, Xara and Inkscape.

    Krita:
    http://www.koffice.org/krita/

    Basic image editor. Similar to a lightweight Adobe Photoshop or GIMP. Has recently added support for extended colour systems, including 16 bits per channel RGB as well as CMYK.

    KPlato:
    http://www.koffice.org/kplato/

    Resource management, planning and Gantt charting a la Microsoft Project. Excellent tool for planning any multi-person project.

    KFormula:
    http://www.koffice.org/kformula/

    Like Open Office Math, it's a formula writer for the mathsy folks out there.

    That pretty much covers the popular alternatives. There's more out there, but these are the most popular and mature/usable for home users, students, and corporate offices alike.

    All of these systems are free (as in freedom) and free (as in no cost). Don't feel that you need to limit yourself to one system. Try them all out and see which suits your needs the best. As mentioned, I use Open Office primarily, but it lacks a Gantt Charting tool for when I do project work, so I switch to KOffice's KPlato system for that. I'll talk about desktops a little later on, but GNOME users can happily use KDE programs and vice versa. Don't be afraid to try programs from other desktop systems!
     
    Last edited: Dec 10, 2008
  6. elvis (OP)
    ReactOS
    http://www.reactos.org/


    ReactOS is an entirely free non-Linux operating system.

    It is designed from the ground up to be a true Windows replacement. It's only in the alpha stage at the moment (for the non-programmers: this means it is early in its development, probably unstable, and not recommended for true production use).

    With original code, as well as some derived from the WINE project, its goal is to be an operating system equivalent to Windows that allows users to download and run standard 32bit Windows .exe files without any extra work.

    For people who want to escape Windows, but find that their core applications (or close equivalents) are not available on other platforms, this could be the answer. I know a lot of my small-business clients are desperate to leave Windows but are stuck with it, due to programs like MYOB, Quicken and others being mandatory to their business and the way it interacts with other businesses (accountants in particular seem to be a bunch that refuse to work with you if you don't give them the right filetypes from their in-house software).

    There's still a few years of development left before this gets to where it needs to be, but in the meantime it's available for people to download and test.

    Check the screenshots page:
    http://www.reactos.org/en/screenshots.html

    and you can see a wide variety of Windows programs running natively in this Windows look-a-like OS.
     
    Last edited: Dec 8, 2008
  7. elvis
    Graphics Programs

    Before I start, a quick note on graphics image types. There are two major types of graphic data: raster and vector.

    Raster is a map of bits (a "bitmap") where each pixel (dot) is one piece of information, stored digitally as a colour. Raster images are great for storing "noisy" data such as photographs and other images, but scale poorly (they get blocky when you scale them up).

    Common raster formats include JPG, PNG, GIF, TIF, BMP, etc.

    Vector is graphic information that is stored mathematically as lines, shapes, colour fills and gradients. Vector information is preferred in the print industry, as it allows images to scale without losing any quality, and makes prints come out clear and precise without blockiness or artifacting. Vector works well for storing simple "illustration" style graphics where colours are uniform and/or in even gradients.

    Common vector formats include PDF, EPS, PostScript, AI (Adobe Illustrator) and CAD formats like DWG, DXF, etc.

    GIMP (GNU Image Manipulation Program)
    http://www.gimp.org

    Raster editor.

    The grand daddy of non-free image editing is of course Photoshop. It's so popular it's now a verb! ("Hey, did you see that picture where that guy photoshopped a bunny to put a pancake on its head?").

    GIMP is an alternative package that comes pretty close to providing the same functionality. Available on Windows, Mac and Linux, it caters for the needs of most folks easily. For a low-end user like me, I can resize images, save them in different file formats, happily import from formats like PDF or Photoshop PSD, colour correct, change colour channels and apply other filters, do red-eye reduction, and use all the usual tricky tools like layering, clone stamping, cropping, warping, etc.

    GIMP's biggest downfall is that it only supports RGBA (Red Green Blue and optionally Alpha). This is fine for people who only use images for output to TV/monitor (eg: websites, movies, etc). For print people, the CMYK (Cyan Magenta Yellow blacK) colour space is missing. GIMP devs have been saying it will be added for a while now, but so far no dice.

    GIMP can still happily print to a colour printer of course. I print all of my arcade artwork via GIMP and am happy with it. For professionals who need proper industry-quality colour correction, it unfortunately falls short.

    All in all it's a great package and for 99% of the population who only use 10% of the features anyway, a perfect substitute for Photoshop.

    Inkscape
    http://www.inkscape.org

    Vector editor.

    Similar in style to Adobe Illustrator or Xara Extreme, this is a simple vector editor that's perfect for folks wanting to output clean SVG or PDF information for web use or print.

    It has some very handy tracing software built in to convert raster to vector, enabling you to scale to your heart's content. I wrote a thread with some examples here:
    http://www.aussiearcade.com.au/showthread.php?t=1858

    Allows drawing of simple line objects, or complex gradients and transparencies. I use Inkscape frequently for drawing my own arcade sideart and marquees, and the results are quite nice.

    Inkscape imports Adobe Illustrator files, and freely outputs to all filetypes including PDF, EPS, SVG, etc.

    Scribus
    http://www.scribus.net/

    Page Layout

    Akin to Microsoft Publisher, Scribus is a free page layout and desktop publishing application. Handy for all sorts of odd documents: restaurant menus, kids' school assignments, newsletters, multi-columned text, etc. Outputs to PDF, which makes mass printing easy.

    Open Office Draw
    http://www.openoffice.org/product/draw.html

    Karbon
    http://www.koffice.org/karbon/

    Krita
    http://www.koffice.org/krita/


    Already mentioned above in the "office" section, so I won't go into detail here again. These cover the basic needs of most non-graphic-artist users who need simple image editing and drawing functions to add to their office documents. Krita has recently added support for extended colour systems, including 16 bits per channel RGB as well as CMYK.

    Blender3D
    http://www.blender.org/

    3D content creation, editing and rendering

    Similar to 3D Studio Max, Maya and other 3D systems, this allows creation, rendering and animation of anything, limited only by the user's skill. It's been used in some big-budget Hollywood movies, and is picking up steam in the 3D community.

    Commercial creation and rendering tools like this cost literally thousands of dollars per license. Blender is free (as in freedom and as in free beer), and just as powerful.

    As usual, Windows, Mac and Linux all supported.

    QCad
    http://www.qcad.org/qcad.html

    2D Computer Aided Drafting/Drawing

    Simple 2D CAD package similar in functionality to the early days of AutoCAD (before they went all 3D). I use QCad entirely for all of my arcade machine design and builds, arcade joystick designs, laser cutting, etc, etc.

    It works only in DXF format (an older format supported by most CAD programs on the market), which is suitable for most engineering firms and laser cutters if you want to provide them with a digital file to work from.

    Most Linux distros will include QCad's "community" version (it comes with no commercial support, but is identical to the commercial version). Source is available and will compile cleanly on Mac (I think it's also in Fink). For Windows, you'll need to either compile it yourself in MinGW, or buy a pre-compiled version. I've been meaning to get off my arse and build a Windows version for some friends, but never got around to it. Most of them are happy enough to dual-boot Linux to use CAD anyway.
     
    Last edited: Dec 10, 2008
  8. elvis
    Windows and Linux living in harmony in Enterprise

    Intro

    A short essay on the pros and cons of Windows and Linux, and how they can complement each other and compensate for the other's shortfalls.

    This post will be aimed mostly at system administrators, corporate monkeys (like me), and possibly even MCSEs.

    Throughout this post, I ask anyone who's familiar with Windows licensing and pricing to consider the direct cost savings of what I talk about (the licenses themselves), as well as the indirect (remote access, unlimited connections, etc, etc).

    Windows and Linux - their history

    Windows started life as a low-end, single-user desktop system. Over the years, networking and multi-user code was tacked on, and with it came the headache of multi-user system management, file-system security for non-admin users, and a host of other stuff that makes it a rather awful server system to try and implement securely.

    Linux is the polar opposite. It started its commercial life in large enterprise on whopping big servers as a UNIX replacement. Built from the ground up as a huge, multi-user system, it too followed the UNIX tradition of being network-focussed from day one. As add-ons like the GUI came along, they stayed true to the "the computer is the network" philosophy, but that philosophy made life difficult for the end user. Even today, despite leaps and bounds, Linux as a GUI desktop still has its shortfalls.

    Both OSes have come far, and have a large corporate following. Both have their obvious pros and cons. Luckily they can exist together happily in the same network, and even leverage each other's strengths to reduce their own weaknesses.

    Active Directory and Windows Domains

    So you want to run Windows on desktops in a corporate network, using Domain log-ons? Not unusual by any stretch of the imagination. Windows desktops are fairly cheap, and have a lot of corporate-focused software.

    Windows Servers on the other hand are often an easy target for viruses and worms, and cost big bikkies in licensing. For each connecting user, despite that user already having a Windows desktop license, Microsoft charge yet another fee: the Client Access License (CAL). For small business, that can get exy. Consider your average 5-man small business who has to pay $250 per desktop for Windows XP, and $799 for Windows Server. Add another 5 users, and that's another $250 per desktop, plus another $600 for a 5-CAL pack for Windows Server. That ends up costing the business far more in software and licensing than they probably paid in hardware!
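    The arithmetic from that example, sketched out below (the prices are the hypothetical figures above, not a current quote):

```shell
# Hypothetical per-seat prices from the example above
xp=250        # Windows XP desktop license
server=799    # Windows Server license
cal5=600      # 5-pack of Client Access Licenses

first5=$(( 5 * xp + server ))   # the first five staff
next5=$(( 5 * xp + cal5 ))      # five more staff
echo "First 5 users: \$$first5"
echo "Next 5 users:  \$$next5"
echo "Total:         \$$(( first5 + next5 ))"
```

    Nearly four grand in licensing for a ten-seat office, before a single piece of hardware is bought.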

    Enter Linux.

    It surprises me just how few MCSEs understand HOW Windows works. They know WHAT to do, but frequently not WHY they are doing it.

    A quick lesson in Windows domains:

    Windows Active Directory and NT domains are nothing magic. They are made up of the following components:

    LDAP: Lightweight Directory Access Protocol. A lightweight database system that is "tree" like in structure, and contains a parent/child relationship system of users, passwords, and other attributes (groups, contact and email addresses, etc, etc).

    LDAP was not invented by Microsoft. LDAP is an open standard implemented by many pieces of software, including OpenLDAP (whose server daemon is named slapd):
    http://www.openldap.org/

    Kerberos: An authentication and encryption protocol used to secure LDAP and other traffic. Again, an open standard:
    http://web.mit.edu/Kerberos/

    CIFS/SMB/NMB: Common Internet File System / Server Message Block / NetBIOS name service. Gobbledygook meaning "how to share files over a network". These are the primary protocols used by Microsoft networking for file sharing, print sharing, mapped drives, NetBIOS name resolution, etc. Once again, a free (as in freedom) implementation exists in the form of the massively useful and widely used SAMBA:
    http://au.samba.org/

    When a Windows Server isn't a Windows Server

    Now here's the kicker: OpenLDAP + Kerberos + SAMBA = A Windows server replacement? Cost for 1 user? $0. Cost for 1,000,000 users? $0. License free. Quite often Kerberos can be removed from the equation to lower complexity also.

    Where I work right now, we have 1500 users on Windows desktops who all connect to a "Windows Domain". They share files on a "Windows File Server". The tricky part is, none of it is actually Windows. It's all SAMBA with an LDAP backend. Adding users by the truckload costs nothing in licensing, as long as the desktop machines have the correct license installed. Very cheap, and very easy to maintain.
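    To give a flavour of what's involved, here's a minimal smb.conf sketch for a Samba 3 domain controller with an LDAP backend. This is illustrative only - the EXAMPLE workgroup, the dc=example,dc=com suffix and the paths are placeholders, and a production setup needs a good deal more than this:

```ini
[global]
   workgroup = EXAMPLE
   security = user
   domain logons = yes
   domain master = yes
   passdb backend = ldapsam:ldap://localhost
   ldap suffix = dc=example,dc=com
   ldap admin dn = cn=admin,dc=example,dc=com

[netlogon]
   comment = Domain logon service
   path = /srv/samba/netlogon
   read only = yes
```

    Point the Windows desktops at the EXAMPLE domain, and they neither know nor care that no Windows Server is involved.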

    Any Windows server admin worth their salt will tell you backup domain controllers are mandatory: and they are right. No hardware is perfect, and servers do occasionally go boom. Once more, Linux to the rescue. Building a "BDC" costs only the hardware you want to implement it on (less if you use Xen Hypervisor "virtual machines" like we do - more on these later).

    Want the best of both worlds? A Linux/SAMBA box can happily act as a BDC to a real world Windows Server box. Great for people who need a backup, but still want to maintain their Windows boxes in operation. Happy co-existence for all.

    Furthermore, Linux makes backup and restore a piece of cake. Unlike Windows, Linux has no registry or hidden information. All information is stored in logical places (/etc contains system-level configuration, /var/lib contains variable state data, such as the LDAP database). All of these are plain-text files that can be backed up with a simple copy and zip. When it comes time to restore, unzip back to the original location and restart the service. No reinstalls, no license numbers, no headaches.

    I can have a domain controller built from the ground up and fully functional in 15 minutes. Less if I script it. There's nothing like telling your CEO that even in the event of a "total destruction" fire/cyclone/flood, you can have a site office converted into a working server room in half an hour or less. Disaster recovery under Linux quickly becomes a very tempting alternative to the hair-pulling experience of Windows restores from tape, or the utter shitfit Windows has when changing underlying hardware (more on that later).
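    As a sketch of how simple that backup and restore really is (paths, dates and init script names here are illustrative, and vary by distro):

```shell
# Back up everything that defines the server: config plus the LDAP database
tar czf /backup/server-$(date +%F).tar.gz /etc /var/lib/ldap

# Restore onto a fresh install: unpack over / and restart the services
tar xzf /backup/server-2008-12-08.tar.gz -C /
/etc/init.d/slapd restart
/etc/init.d/samba restart
```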

    Now a quick "gotcha":

    Samba is currently at version 3. This implements a full Windows NT4-style domain and all associated goodies. Currently this means no Group Policy. Group Policy is a must for quite a few businesses, and its absence writes off Samba as a valid alternative for them.

    But have no fear! Samba 4 is now in testing phase, and implements a full and complete Windows Server 2003 (and upcoming Vista Server) Active Directory, complete with built-in LDAP/Kerberos (no need to install and configure a separate system) and complete Windows Server Group Policy objects and control. Release is due probably mid to late this year, and it's an exciting prospect for anyone who wants to look at implementing a true and up-to-date "Windows on the desktop, Linux on the server" network.

    More to come....
     
  9. elvis
    Windows vs Linux - Hardware support

    There's quite a widespread misconception that Linux has poor hardware support. This is the sort of thing that gets perpetuated when long-time Windows users run Linux for the first time, and can't find downloadable drivers for their hardware.

    That's usually because there are no downloadable drivers.

    What??? No downloadable drivers? How the hell does Linux work??? Good question.

    The Linux kernel is what's called a "monolithic kernel" (as opposed to "microkernels" like Mach and the GNU Hurd). Linux takes an "everything and the kitchen sink" approach to making a computer run.

    When a Linux kernel is made, it contains drivers for all known hardware. These can be compiled into the kernel directly (handy for embedded devices like phones, PDAs, PVRs, etc) or they can be compiled as individual modules. The latter is handy for distributions like Ubuntu, where the person writing the distro doesn't know what hardware the end user will use.

    Distros like Ubuntu, RedHat, SuSE, Debian, etc use the "throw it all at the wall and see what sticks" approach. Each one uses roughly the same Linux kernel, and as the system boots simply tries to load each and every driver. If a driver finds it's corresponding hardware, the driver stays. If not, the driver unloads.

    Now this sounds long and painful, but it's not. On a low-end 1.5GHz desktop, you'd be lucky to see this section of the boot process take longer than 5-10 seconds.

    The disadvantage is that if your hardware is not supported by the Linux kernel, finding a driver and adding it in requires a bit of Linux knowhow. And if you upgrade your kernel later, it can break the driver support, and you'll have to do the process again.

    The advantage, of course, is that Linux knows about tens of thousands of pieces of hardware. For most users running common hardware, compatibility with Linux is a no-brainer. Simply install Linux, boot into a clean system, and everything JUST WORKS. No driver installs, no hardware conflicts, nothing. Less time screwing around in hardware manager, and more time doing real work.

    Furthermore, installing new hardware is painless. Power down, install new hardware, power up. Linux finds it, and loads a driver for it. Done.
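    You can watch this for yourself from a terminal (the first two commands need no root at all; e1000 is just an example module, Intel's gigabit NIC driver):

```shell
# Which driver modules did the kernel auto-load for this box?
lsmod

# What hardware is sitting on the PCI bus?
lspci

# Load or unload a module by hand, on the rare occasion you need to
sudo modprobe e1000
sudo modprobe -r e1000
```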

    Anyone who is still a fan of Windows' manual driver loading process, try this experiment:

    Take a working Windows system. Either desktop or server, it doesn't matter. Now power down cleanly, and remove the hard disk. Take that hard disk and put it in another machine with completely different hardware (different graphics card, different chipset, different brand of CPU). Power up, and see what happens. At best you'll spend 15-30 minutes reconfiguring devices, loading new drivers manually, etc. At worst, the system will bluescreen and you won't be able to use it. Reinstall time for you.

    Do the same with a working Linux system, and it's a different story. Linux boots, detects the new hardware on the fly, and loads the appropriate drivers. In fact, every boot for Linux is the same - boot, detect hardware, load OS. Whether it's been on the same hardware since day dot, or you change hardware every day, it's the same loading process.

    I have on numerous occasions now rescued businesses by throwing known working hard disks from servers with blown RAM/motherboards/etc into a completely different system, and it's booted fine and the business has continued as normal. See my earlier comment about telling your CEO that you can convert a site office into a working server room in under an hour in the event of your head office being totally destroyed. In terms of business, that's great reassurance.

    RAID, LVM and enterprise disk management

    RAID: Redundant Array of Inexpensive/Independent Disks

    RAID is a way of writing information to multiple disks in one go. It can mean splitting the information up so each disk writes half as much, making the write twice as quick (called "RAID0" or "striping"), or simultaneously writing the same data to 2 or more disks at once, so that if one fails the other takes over on the fly and the system notifies you to replace the busted one (called "RAID1" or "mirroring"). More complex RAID exists where data is split over multiple disks for speed, but with an extra parity dataset (simple XOR arithmetic) that allows a broken disk's information to be rebuilt mathematically after a new disk is installed (called "RAID5" or "striping with parity"). Very useful, but tough on resources, as the parity calculations cost CPU time.

    There are 2 types of RAID: Software and Hardware.

    Now here's the tricky bit: a lot of people incorrectly assume "Hardware RAID" means it comes on a card or chip. THIS IS WRONG. Hardware RAID is where a RAID controller has a dedicated RAID calculation CPU (an Intel XScale running at around 300MHz is a popular choice). You will know a card is hardware RAID because (a) the card will cost more than $600, and (b) the RAID set will appear to your computer as a single logical/virtual drive - you won't see the individual disks.

    Software RAID can be done in pure software, or on a card. Cards like these cheap shit "Promise" devices that sit in the market at anywhere from $100 to $300 are NOT hardware RAID. Despite offering RAID via a card, the actual grunt work for the RAID is done in software via a driver that has some extra code on top that makes your system's CPU do all the hard work. The card itself is dumb, and just passes information back and forth.

    Linux has a piece of software built into it called MD (Multiple Device). In Linux, hard disks are block devices named hd (IDE/PATA hard disk) or sd (Serial/SATA/SCSI/USB/Firewire hard disk). MD is a kernel-level virtual device capable of marrying ANY disks together (you can have IDE and SATA disks mixed in Linux RAID!) into a virtual disk configured at any RAID level (0, 1, 5, 6 and 10 are currently the most popular). Adding and removing drives on the fly is done with simple command line inputs, and realtime statistics on drive rebuilding, drive health and other useful info are output via the Linux virtual "proc" filesystem as plain-text, realtime-updated readable files (adding the ability to monitor this over network/internet/web-browser is very simple).

    Linux only supports RAID1 (mirroring) on your /boot partition. Why? Think about it: the kernel needs to be loaded before the system can know what RAID is, so how can it load from a RAID disk? RAID1 is an exact mirror, so each disk holds a complete copy: the kernel is simply read from one disk, and once it's up and knows about RAID, the second disk is assembled into the mirror. RAID0, 5 and 6, which all involve striping (one file lives half on one disk and half on another), are not supported for /boot. However, the rest of your Linux system, and especially your user data, are free to live on other RAID levels.
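    Those "simple command line inputs" come from the mdadm tool. A sketch only - the device names are assumptions (note the IDE and SATA partitions happily mixed in one set):

```shell
# Mirror an IDE and a SATA partition into one virtual disk, /dev/md0
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/hdb1 /dev/sdb1

# Realtime health and rebuild progress, straight from the proc filesystem
cat /proc/mdstat

# Add a hot spare: it takes over automatically if a member dies
mdadm --add /dev/md0 /dev/sdc1

# Fail and remove a dying disk - all doable over SSH from hundreds of KM away
mdadm --fail /dev/md0 /dev/sdb1
mdadm --remove /dev/md0 /dev/sdb1
```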

    When using Linux, I recommend one of the two following alternatives:

    Use either a REAL HARDWARE RAID card, or use Linux's built in kernel-level software RAID. Do not, under any circumstance, use a cheap and nasty software RAID card. Why? Several reasons:

    1) Driver support for software RAID cards is often proprietary, and involves all sorts of painful configuration for Linux

    2) Performance of software RAID cards is typically WORSE than Linux's own internal software RAID

    3) Software RAID cards often provide no way of adding and removing drives remotely. For someone like me who administers servers hundreds of KM away, I simply cannot afford the downtime of flying out to a site just to replace a hard disk when a few simple commands could add in a hot spare, letting the onsite admins take their time replacing the busted drive under warranty.

    4) Hardware RAID provides true, independent RAID calculations on the card itself (ironically, most true hardware RAID controllers run a small embedded Linux or BSD subsystem!). Drives appear to the operating system (Windows or Linux) as a single SCSI drive, and the end user is largely ignorant of what goes on "behind the scenes". True hardware RAID cards are preferred over software RAID cards because they keep working even if the OS has crashed. The downside is that only the most expensive multi-thousand dollar cards support remote access, meaning that if a drive dies, you're on your bike out to the site to replace it manually - although you can add in automatic hot failover to spare drives, and if you work in the office where the server is, it's not as huge a drama.

    As Linux MD RAID devices are considered true block devices (Linux considers them a "real" hard disk), they can be used in all sorts of tricky ways. Enterprise users will know terms like iSCSI and Fibre Channel. Under Linux, an MD can be exported as either. Using a cheap Linux box and a dozen SATA drives, you can build your own iSCSI disk to use on any system you like (where I work we have 2 Linux iSCSI machines that serve as hard disks for Windows servers), at around one third of the cost of commercial iSCSI devices!
     
  10. elvis
    LVM - Logical Volume Manager

    Migrating drives sucks. You've all been there: your workstation or server disk is full, and the users are bitching that they want more space. You know buying a bigger file server costs money, and the migration is a headache.

    Enter LVM!

    Linux LVM is a virtual layer that sits over any device. It can live on top of real hard disks, USB hard disks, MD RAID devices, iSCSI and Fibre Channel connections, etc. What it does is create a virtual disk over the top of the physical disks, which can be created, destroyed, resized or extended with a few simple commands.

    So, lets look at a real-world scenario. I have a graphic design client who chews through disk space like a fat kid goes through ice cream. I built them a Linux server (with SAMBA and LDAP for Windows and Mac file sharing and Domain log-ons). In it I put 2x 80GB hard disks in a Linux-software RAID1 mirror for boot and OS (called /dev/md0 by Linux), and 4x 320GB hard disks in a Linux-software RAID5 stripe+parity (giving them 960GB of space - one disk's worth of space is lost to CRC, but it means they can lose one disk out of the set and keep working without data loss). This is called /dev/md1 by Linux.

    Now here's where the smarts come in. Over the top of the 960GB set, I put LVM (which Linux calls /dev/mapper/data), containing one drive called /dev/md1 (which is really 4 drives, but LVM doesn't care). On top of that, I put a normal Linux file system, and then share it all via SAMBA. Each morning, users log on to their Windows workstations, and see their network server share (mapped to drive S:\ in Windows) as 960GB of space.

    6 months later, the disk is full! What the hell these guys do, I don't know, but hey - it's their business and not mine. They want more disk space! What to do?

    Simple. I run out and buy another 4x320GB SATA hard disks. I plug them into the Linux box. I tell Linux MD RAID to combine them as a single RAID device /dev/md2. I then tell LVM that /dev/mapper/data now extends across two hard disks, /dev/md1 and /dev/md2. Tadaaa! /dev/mapper/data jumps from 960GB to 1920GB (1.9TB - Terabytes). I "unmount" the Linux file system, tell it to resize itself to fill up the rest of the disk, and remount. In 15 minutes, the users now have double the disk space, and there was no need to migrate data to new servers!
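    The whole upgrade above boils down to a handful of commands. A sketch only - the new drive letters and mount point are assumptions, and I've kept the /dev/mapper/data name from the example for continuity:

```shell
# Combine the four new disks into a second RAID5 set
mdadm --create /dev/md2 --level=5 --raid-devices=4 /dev/sd[efgh]1

# Teach LVM about the new set and grow the volume group onto it
pvcreate /dev/md2
vgextend data /dev/md2

# Grow the logical volume into the free space, then resize the filesystem
lvextend -l +100%FREE /dev/mapper/data
umount /srv/share
resize2fs /dev/mapper/data
mount /srv/share
```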

    LVM has another neat trick called "snapshotting". This is a means of pausing a file system and taking a "snapshot" in time of its contents. This snapshot can be stored in spare space at the end of the disk, or compressed as an image and sent to another server. If your server blows up, you can either repair it and restore from the image, or simply fire up the image on your spare server, give it the same network details, and all of a sudden it's running on a different box, yet none of the users know the difference! Remember again that Linux servers store all of their config data in plain-text files. No need to restore registries or other complex binary-only files you can't read. Reconfiguring a server is a matter of overwriting some plain-text files and rebooting!
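    Snapshotting is equally terse. Again a sketch, with illustrative volume, mount-point and host names:

```shell
# Freeze a point-in-time, copy-on-write view of the "share" volume
lvcreate --snapshot --size 5G --name snap /dev/data/share

# Mount it read-only and ship an image off to another server
mount -o ro /dev/data/snap /mnt/snap
tar czf - -C /mnt/snap . | ssh sparebox 'cat > /backup/share.tar.gz'

# Throw the snapshot away when done - users kept working the whole time
umount /mnt/snap
lvremove /dev/data/snap
```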

    Again, revisiting my earlier comment about having a site office become a server room in under an hour. And again for the Microsoft Windows Server users, remember that all of what I describe costs $0 in software. None of it is unique to Linux by any means - but doing the same with proprietary products costs tens or even hundreds of thousands of dollars in software, licensing and proprietary hardware (most of which is built on embedded Linux/BSD anyway).

    So that's part 2 of my enterprise Linux intro. Part 3 is coming later, and will cover virtual machines, virtualisation, hypervisors, and a way to use Windows Server and Linux together and not need to worry about drivers and hardware dramas ever again. :)
     
  11. elvis
    *user question about Menuet*

    Menuet is FAST. The beauty of writing your own stuff from the ground up is you can go nuts on speed and avoid the overhead of big bloated systems like UNIX and Linux.

    Menuet is a great candidate for things like Point Of Sale machines, web terminals, etc. If that sort of "ultra small device" thing floats your boat, check out GumStix:

    http://www.gumstix.com/

    An entire hardware system with wireless ethernet, the size of your average stick of chewing gum!

    Wikipedia has a list of FLOSS (Free/Libre/Open Source Software) OSes here:
    http://en.wikipedia.org/wiki/Comparison_of_open_source_operating_systems

    Nice comparison of what architecture they support, what their legacy design was based on, etc.
     
  12. elvis
    *user question about "which distro should I try?" *

    [edit]
    As mentioned elsewhere, this post was originally penned for a less technical crowd. Rather than flummox them with options, I recommended a single distro that was focussed at desktop users.

    I still recommend people new to Linux go with a distro that has good community support. With that in mind, Ubuntu, Fedora and OpenSuSE are great starting points.
    [/edit]

    I always recommend Ubuntu (or Kubuntu or Xubuntu) to new users. Why?

    1) ENORMOUS community support. You will never be stuck in a position where you can't ask someone for help for free

    2) End-user focus. Ubuntu thinks about the user first, and the computer second. Linux is traditionally a "nerd OS" where you need to know how things work at the back end. Ubuntu attempts to stop that. Some die-hard Linux/UNIX users complain that this "dumbs down" computing too much and that users *should* know what goes on beneath. I don't agree. Just like you don't have to be a mechanic to drive a car, you shouldn't need to be a nerd to use a PC. Ubuntu is packed full of wizards to auto-configure everything for you. If you don't like it, feel free to configure it manually (plenty of Linux geeks like me do just that). But if you're happy with the working pre-set defaults, it means your system works without the need to get your hands dirty.

    3) Regular release schedules. New versions come out periodically and predictably. If you are happy with the features and hardware support of your current version, the system won't force you to upgrade (and potentially break things), unlike some other distros.

    4) No difference between "commercial" and "free" versions. There is one version of Ubuntu. There is no "Ubuntu enterprise" version with a whole bunch of extra features and better support.

    5) APT package manager. Debian's APT package manager (and GUI frontends like Synaptic) makes installing new programs a breeze. No "dependency hell" like RedHat and others, where you constantly need to manually hunt down other bits to make your programs work. APT does all the hard work, and makes install, uninstall and upgrade a breeze.

    At the end of the day, all Linux distros are largely the same under the hood. No one is "better" than another, they just tweak a few things here and there from a setup and install point of view. Ubuntu puts more effort into making Linux easy for end users from a GUI point of view. If you're a Linux virgin, it's a bloody great place to start.
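    For the curious, the APT workflow mentioned above amounts to this (Inkscape is just an example package):

```shell
# Search for, install, then remove software; APT chases dependencies for you
apt-cache search inkscape
sudo apt-get install inkscape
sudo apt-get remove inkscape

# Bring the entire system up to date in one go
sudo apt-get update && sudo apt-get upgrade
```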
     
    Last edited: Dec 10, 2008
  13. elvis
    Enterprise Computer Part 3: Virtualisation

    What is Virtualisation?

    Traditionally an operating system ran on one piece of hardware. In days of old, resources like CPU power, RAM and disk space were scarce. Wasting precious resources running multiple OSes on one machine was not very smart.

    Nowadays we have terabytes of disk space, gigabytes of RAM, and multi-core processors. The idea of running two or more OSes on one physical server isn't such a difficult thing.

    More than one OS on a server? Why bother?

    For plenty of reasons. Security is a big one. Running multiple services on a single physical OS means you have multiple entry-points into your server. You're no longer relying on just the server's OS as your security system: each and every running service becomes a point of vulnerability. Locking each service into its own virtualised OS means that, while the point of vulnerability remains, someone who breaks into your website, for instance, doesn't find your finance database on the same server - there's at least one layer of security between it and the nasty person breaking in.

    Another is rapid recovery after disaster. It's nice to have a server, but what happens when that server breaks? Traditional backups only get you so far: there's the time it takes to restore the backup to the old hardware (assuming it's not the hardware that broke), potentially losing any data created since the last backup, or there's the woe of restoring data to an entirely new piece of hardware, which can mean driver and hardware issues depending on your OS.

    Virtualised servers mean that your server is nothing more than an image (think of an .ISO file, which is an image of a CD, except now you have a hard disk image). Moving an image around isn't difficult. "Backups" mean pausing your virtual server, copying the disk image across the network to another box, and unpausing. If the first physical server breaks, merely turn the image on at the other location. All settings and virtualised hardware remain the same, and the machine magically turns on as if nothing went wrong. Then you can take all the time in the world to fix the broken physical server, having already got all the vital information off the image.
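The pause/copy/unpause routine above can be sketched as a short shell script. This is a dry-run sketch only: the guest name, image path and standby host are hypothetical, the `xm` commands assume a Xen host, and the `run` wrapper prints each command instead of executing it.

```shell
#!/bin/sh
# Dry-run sketch: "backing up" a virtual server by pausing it, copying its
# disk image to a standby box, and unpausing. The guest name, image path and
# standby host are hypothetical examples; xm assumes a Xen host.
DOMAIN=web01
IMAGE=/var/lib/xen/images/${DOMAIN}.img

run() { echo "+ $*"; }   # print only; swap for  run() { "$@"; }  on a real host

run xm pause "$DOMAIN"                                 # freeze the guest
run scp "$IMAGE" standby-host:/var/lib/xen/images/     # copy the disk image
run xm unpause "$DOMAIN"                               # resume the guest
```

If the physical box later dies, the same copied image can simply be started on the standby host instead.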

    Redundancy becomes a nice side effect of the above. It's possible to have entire servers migrate to other physical locations in mere seconds, meaning downtime is greatly reduced. Likewise if you need to upgrade physical hardware, you can migrate a virtual server to another box, and the end users don't even know the difference.

    Cost saving is another. Each virtualised server doesn't need its own switches, network cards, cabling, fibre optic HBAs, RAID controllers, etc, etc.

    Disadvantages
    Performance: if your workload is very strenuous, particularly when it comes to I/O (say, VERY large databases, etc) virtualisation might not be a preferred option. Where I currently work, we keep all of our application and management systems on virtualised hosts (the bulk of our servers in pure number of installs), and our databases and data warehouse on physical machines to keep performance high (only a few servers, but they are very important and need to be working at maximum throughput 24/7).

    Software licensing: Licensing can be a downer for virtualisation. Linux folk don't have to worry of course, but Microsoft want to charge you per machine. Have 10 virtualised Windows servers on a single Linux host, and Microsoft will demand payment for 10 licenses. Bummer.

    Types of virtual machines

    There are two popular types of virtual machines. The first is the traditional hardware emulator/simulator. This builds a virtual CPU that the guest OS talks to, which then translates instructions back to the host (real-world) CPU. This is a nice thing to have if you are doing things like low-level operating system or driver development, because you can do tricky things like slow down or even pause your entire virtual system. On the flipside, for running big apps and services it's slow.

    The second is a newer technology called a "Hypervisor". A hypervisor is a virtual machine supervisor which allows VMs to access hardware more directly. Think of it like your virtual OSes "partitioning" up resources like CPU, I/O (for access to network cards, disks, etc) and RAM. Hypervisors are much better suited to VMs where you need "bare metal" speed from your machine. The downside is that with a hypervisor, the guest OS must directly support the host's architecture. For instance, you can't run Windows on a hypervisor on a non-i386-based processor. The parent hardware must be hardware that Windows itself could run on.

    Virtualisation compatible hardware

    For the "emulated" type of virtualisation, no special hardware is needed. Everyone here has probably used MAME, which is a classic example of emulated hardware. Likewise, the early releases of VMWare and VirtualPC were the same: no special hardware was needed.

    Hypervisor technology can run on standard hardware, but benefits greatly from machines that support virtualisation at a hardware level. This has more to do with the guest machine than the host (eg: Linux, being open source, has Xen hypervisor information built right into the kernel, so Linux on Linux using Xen is fine. Windows doesn't support Xen yet, so Windows on Linux using Xen needs hardware support).

    In large industry (particularly big UNIX makers like HP, IBM, etc), this sort of thing has been happening for a while. On the home and small business front, this technology is finally available on low-end hardware!

    Intel and AMD both have virtualisation extensions available. AMD have "AMD-V", found in Socket AM2 (Athlon64 and AthlonFX) and Socket F (Opteron) motherboards, and Intel have Intel VT (also IVT or VT-x, formerly codenamed "Vanderpool"), which is supported on certain boards (check the board's specs for VT-x support, or check the CPU flags inside CPU-ID, CPUz or /proc/cpuinfo for the "vmx" flag). Most Core2Duo-based Pentium and Xeon systems will have this.
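On Linux you can check for these extensions straight from the shell - the kernel exposes the CPU flags in /proc/cpuinfo ("vmx" for Intel VT-x, "svm" for AMD-V):

```shell
#!/bin/sh
# Check /proc/cpuinfo for hardware virtualisation flags.
# "vmx" = Intel VT-x, "svm" = AMD-V.
if grep -qwE 'vmx|svm' /proc/cpuinfo 2>/dev/null; then
    echo "hardware virtualisation flags present"
else
    echo "no vmx/svm flags found"
fi
```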

    If you want to investigate virtualisation in detail, buying one of these platforms is highly recommended (as is getting buckets of RAM).

    The Software

    Xen
    http://www.xensource.com/


    This is my pick of the bunch. Xen kernels are available for a lot of Linux distros. The new commercial RedHat and SuSE versions come with "out of the box" support for Xen. RedHat's system is excellent: you pay for a single license of RedHat Enterprise Linux (RHEL) and you may install an UNLIMITED number of Xen virtual machines on a single box! Anyone who's needed test machines for network service testing, or an insta-cluster, will love this sort of thing.

    Debian and Ubuntu both have Xen kernels supplied in the repos. Ubuntu have an easy Xen guide here:
    https://help.ubuntu.com/community/XenVirtualMachine/XenOnUbuntuEdgy

    I use this method on many servers to build dozens of test and rollout servers for clients. Very easy stuff, and once you've got the building process down pat, pushing out working VMs takes mere minutes. (It took me literally 10 minutes to build a working webserver for a client the other day from nothing - complete VM, OS and web server software from scratch up, working and secure).
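One common route to that kind of fast build on Debian/Ubuntu is the xen-tools package and its xen-create-image command. A dry-run sketch - the hostname, distro and sizes below are hypothetical examples, and the `run` wrapper prints the command rather than executing it:

```shell
#!/bin/sh
# Dry-run sketch of building a Xen guest with xen-tools' xen-create-image.
# The hostname, distro and sizes are hypothetical examples.
run() { echo "+ $*"; }   # print only; swap for  run() { "$@"; }  on a real host

run xen-create-image --hostname=web01 --dist=etch \
    --memory=256Mb --size=4Gb --swap=256Mb
```

Once the image is built, starting it is a matter of pointing xm at the generated config file.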

    Xen is entirely free (as in freedom). The free (as in cost) version is all command-line based, but if you want a nice and easy GUI, grab the commercial pay-for version.

    VMWare
    http://www.vmware.com/


    This is the name most people equate with virtualisation, probably because they were one of the first to do it at a desktop/consumer level. VMWare initially marketed their products at people who needed multiple platforms for testing and development, but these days have also jumped on the "disaster recovery" bandwagon.

    VMWare has always seemed expensive to me, but the VMWare team have a lot of experience in this market, and from all accounts the commercial support is good.

    Parallels
    http://www.parallels.com/


    A newcomer to the market, Parallels works on all OSes. It's a nice way to run Linux on your Windows or Mac computer without needing to reboot, or as a commercial alternative to Xen/VMWare for Linux. I know a few folks who set up fast user switching (or dual monitors) in both Windows and MacOSX, and have a parallels session in the other setup running another OS. And remember, because Parallels is a hypervisor, you'll need Apple Boot Camp to get Windows running on your Mac.

    QEmu
    http://fabrice.bellard.free.fr/qemu/


    This is an odd one. It lies somewhere halfway between hypervisor and emulator. Totally free (as in freedom), it's a much easier way to get Windows working on Linux. Currently where I work we have a nice big 8-processor Xeon box running Linux, with all of our Linux virtual machines on Xen and another four Windows 2000 Server machines on QEmu. One plus to QEmu is that Windows runs FASTER under it than on native hardware! Windows 2000 servers boot and get to logon in under 30 seconds - something that previously took close to 3-4 minutes! I'm yet to understand why the speed increase is so dramatic, but in the meantime we love it, and have vowed never to run Windows on native hardware again.

    We've replaced a server room full of machines with 2 hot-swap redundant super-servers running Linux, Xen, QEmu and Linux and Windows VMs. As before, our backups are now so simple, and hot-swapping entire machines in the event of hardware failure is a breeze. The cost and maintenance savings are simply enormous, and it means we can get on with preventative maintenance and network improvement rather than running around wiping the arses of dozens of physical machines.
     
    Last edited: Dec 10, 2008
  14. elvis:
    Some more Virtual Machines:

    KVM
    http://kvm.qumranet.com/


    Stands for Linux "Kernel-based Virtual Machine". I personally think it's a silly name, as KVM means "Keyboard Video Mouse" where I come from, and means a piece of hardware that lets you use one keyboard, video and mouse for multiple machines. Boo to people who use already taken acronyms. But anyway...

    KVM is the new Hypervisor-based Virtual Machine being worked on directly by the Linux Kernel team. So far it seems to be the fastest of the hypervisors according to benchmarks. I've not used it myself, but from what I read it works very closely with QEmu to provide bridged virtual ethernet adaptors and other interfaces to talk to the kernel's IP (and other I/O) stacks. It's definitely the baby of the VM world, existing only from Linux 2.6.20 kernel and up (which itself is only a couple of months old at time of writing). While the big commercial distros like RedHat and SuSE are backing Xen, Ubuntu announced that their latest release "Feisty Fawn" will support options to install both a KVM host as well as a KVM virtual machine straight off the install disk/ISO. That means simple point-and-click VM setup for users, which is always a good thing.

    VirtualBox
    http://www.virtualbox.org/


    VirtualBox by InnoTek (make sure you put a cover letter on your TPS reports).

    This has since been bought by Sun Microsystems, and has progressed a great deal. It includes both a GUI and back-end/CLI management tools, simple management and mounting of virtual disks, and a wide range of networking options including NAT and bridged connections. The new versions also include options for USB passthrough to the guest OS.

    Drivers are included as "guest extensions" for Windows, allowing you to have better host->guest integration of your mouse, shared folders, 2D video acceleration, etc.

    Supports virtualisation extensions in hardware also (ie: speedy).
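The back-end/CLI side is handled by the VBoxManage tool. A dry-run sketch of creating a guest - the VM name and settings are hypothetical examples, and the commands are printed rather than executed:

```shell
#!/bin/sh
# Dry-run sketch of creating and configuring a VirtualBox guest via VBoxManage.
# The VM name and settings are hypothetical examples.
run() { echo "+ $*"; }   # print only; swap for  run() { "$@"; }  with VirtualBox installed

run VBoxManage createvm --name demo-vm --register         # create and register the VM
run VBoxManage modifyvm demo-vm --memory 512 --nic1 nat   # 512MB RAM, NAT networking
```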
     
    Last edited: Dec 9, 2008
  15. elvis:
    Multimedia and Audio Players

    Linux is quite literally swamped with multimedia players and dedicated audio players. There are plenty of command-line players out there that work very well when interfaced with web frontends if you ever want to set up a web-accessible jukebox or similar sort of thing. One thing I hate about GUI/Graphic stuff is that it makes for a really poor dedicated jukebox or hidden sound system. But anyways... I'll leave those for another day. Today I'll talk just about desktop players.

    As mentioned, there are literally hundreds of the buggers. Rather than go into excruciating detail about all of them, I'll just stick to the most popular dozen or so.

    VLC - VideoLAN Client
    http://www.videolan.org/


    This is my personal favourite media player hands down. It works on everything (Windows/Mac/Linux) and has all the necessary codecs built in. Developed in France, there are no stupid DMCA style laws over there, so people are free to make media players that can play proprietary codecs like WMV without fear of being sued (honestly, who the fuck in their right mind sues someone over making a free media player??? God I hate America sometimes).

    VideoLAN can do neat things like:

    - Stream media from a central server to a listening client
    - Stream media to multiple clients simultaneously (have all your TVs playing the same AVIs from a central computer)
    - Stream media and break it up into segments (make your own "video wall" with multiple TVs!)
    - Play back media from any device or file - if you have ISO files that are DVD images, you can play straight from the ISO - no need to burn!
    - Use VLC as a plugin for Firefox to watch WMV, Quicktime and other formats off the net on any computer (great for Linux and Mac users, or Windows users who hate Media Player)
    - Stream from any protocol - http, ftp, udp unicast/multicast, whatever!
    - Full support for post-processing, anti-aliasing, interlace fixup, etc, etc.

    VLC can also be used for one-pass transcoding. Output streams can be converted into other codecs, and streams from WebTV or broadcast can be saved to disk as a video file.
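The transcoding side is driven by VLC's --sout option chain. A dry-run sketch - the filenames and bitrates are hypothetical examples, and the command is printed rather than executed:

```shell
#!/bin/sh
# Dry-run sketch of one-pass transcoding with VLC's --sout chain:
# re-encode input.mpg to MPEG-4 video + MPEG audio and write it to a file.
# Filenames and bitrates are hypothetical examples.
run() { echo "+ $*"; }   # print only; swap for  run() { "$@"; }  to transcode

run vlc -I dummy input.mpg \
    --sout '#transcode{vcodec=mp4v,vb=1024,acodec=mpga,ab=128}:std{access=file,mux=ps,dst=output.mpg}' \
    vlc://quit
```

Swap the std module for a streaming output and the same chain transcodes for the network instead of to disk.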

    Brilliant software. I know a lot of die-hard Windows users who use this instead of Windows Media Player because it really is that good, and dead simple to use.

    VLC has recently been ported to handheld devices too. If you're a Palm/iPaq/WinPhone/iPhone/mobile-phone user, keep an eye out to see if it supports your device.

    MPlayer
    http://www.mplayerhq.hu/design7/news.html


    Will literally play ANYTHING under the sun. Movies, audio, even DVB streams from TV/cable/satellite/capture cards. Comes in both command-line form and GUI for GNOME, KDE, TCL or anything you like. There are even dedicated versions for Windows and MacOSX if you are so inclined.

    The command-line version is very cool, because you can use it to quickly "transcode" between one file format and another. eg: play your DVD, and set the output to be a file instead of the screen, piped through XviD or a similar compression tool. The end result is your DVD saved as an XviD/AVI file! Remember that in Linux "everything is a file", so redirecting output from screen/speakers to a file or even another computer is trivial, and tools like MPlayer suddenly become much more useful than for merely watching videos. :)
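In practice this DVD-to-file trick is usually done with MEncoder, MPlayer's companion encoder. A dry-run sketch - the title number, bitrate and output filename are hypothetical examples, and the command is printed rather than executed:

```shell
#!/bin/sh
# Dry-run sketch of ripping DVD title 1 to an XviD-in-AVI file with MEncoder.
# Title number, bitrate and output filename are hypothetical examples.
run() { echo "+ $*"; }   # print only; swap for  run() { "$@"; }  to encode

run mencoder dvd://1 -ovc xvid -xvidencopts bitrate=1200 \
    -oac mp3lame -o movie.avi
```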

    Xine
    http://xinehq.de/


    Similar in design to MPlayer, it's another popular video/DVD player with some neat back-end tools to do all sorts of trickery.

    Totem
    http://en.wikipedia.org/wiki/Totem_(media_player)


    The default movie player for GNOME, it's a bit odd in that it's more of a frontend for other movie-playing systems. By default it uses GStreamer, but it can be plugged into Xine, MPlayer, or other backends. Pretty basic in its functionality; if you use Ubuntu or another GNOME-based distro, you'll probably have this installed by default. If you find it isn't playing the files you have and don't want to go through the process of manually adding codecs, have a look at VLC as an alternative.

    Amarok
    http://amarok.kde.org/


    Default audio player for KDE, this has everything you'd expect from a music player. Favourite voting, playback of all music filetypes, organisation and grouping systems. It will also happily sync with any Apple iPod.

    Rhythmbox
    http://www.gnome.org/projects/rhythmbox/


    GNOME's answer to Amarok. Same features including iPod support.

    GTKPod
    http://www.gtkpod.org/


    Banshee
    http://www.banshee-project.org/


    YamiPod
    http://www.yamipod.com/


    Three music players I've never used, but are all very popular. Again, all filetypes and iPod support.

    XMMS
    http://www.xmms.org/


    A WinAmp clone, XMMS is a great music player with a small footprint. More than that, it has an ENORMOUS array of plugins for things like MOD, S3M/ScreamTracker, MIDI, and other oldschool sample or instrument/instruction-driven filetypes.

    Best of all, heaps of plugins have been written to play music out of old games. Super Nintendo/Famicom plugins, Gameboy plugins, Commodore 64 plugins, etc, etc. Grab your ROMs and use this baby to listen to the music within. As with other Linux programs, change the output device from your speakers to your disk, and you can write WAV files (and later compress them to MP3). An easy way to convert an old SNES ROM into a music CD with your favourite game music!
     
    Last edited: Jan 17, 2009
  16. elvis:
    *user question about 3D desktop acceleration, and MacOSX's "Quartz Extreme" *

    Apple's MacOSX has used 3D acceleration of *all* screen drawing since 10.2 (Leopard will be 10.5). It's nothing new on offer - it's just that journalists are the bottom of the evolutionary pit, and are only now realising what it can do.

    Initially this was called Quartz Extreme, and offered only OpenGL 1.4 extensions:
    http://www.apple.com/macosx/features/quartzextreme/
    http://en.wikipedia.org/wiki/Quartz_Compositor

    With OpenGL 2.0 and HLSL (High Level Shader Language), they moved up to "Core Image", which allows pixel-shader level programming to enhance Quartz Extreme's fairly basic "draw a window on a rectangle and move it about" with some new sexiness:
    http://www.apple.com/macosx/features/coreimage/
    http://en.wikipedia.org/wiki/Core_Image

    This all goes way back to Apple's native Postscript/PDF rendering, which is a leftover from its NeXT days:
    http://www.apple.com/macosx/features/pdf/

    NEXTSTEP was an old UNIX-based OS that was the great-granddaddy of OSX. Developed by Steve Jobs after he was sacked from Apple (and developed by Apple when they begged him to come back after they lost millions of bucks making the extremely shit OS8 and OS9), it was built from the ground up to accelerate print information, inherited from the old Xerox days (Xerox invented the GUI, and came up with the design for most desktop publishing and print stuff you see today - long before Apple or even Microsoft or Linux were on the scene).

    So NeXT was all about Postscript - the native print language - and was designed to make using it as fast as possible. Postscript is all vector (ie: mathematical shape primitives like triangles and ellipses - kinda like what 3D cards do, only in a single 2D plane). PDF is merely an extension of Postscript, and Steve Jobs and his cohorts over at Apple figured out a long time ago that your average 3D video card was utterly wasted when you were doing normal desktop stuff. They were quick to jump on the technology and use a super speedy graphics card to accelerate their entire desktop. Seeing as MacOSX is built on Postscript and PDF, accelerating it via OpenGL was trivial, as standard primitives like triangles and ellipses are what video cards do best.

    http://en.wikipedia.org/wiki/Display_PostScript

    Apple were really the first to do this, and they are still the best. Both Microsoft's and Linux/Xorg's 3D desktops are last-minute hacks. Both Vista and Linux/Xorg use the acceleration at a very high level - essentially all the information inside a window is drawn by software, and then slapped on a polygon. Speed-wise, this is nice for pretty effects like twirly windows and such, but for actual useful speed boosts it's useless.

    MacOSX accelerates basic drawing functions wherever possible. For raster items (ie: JPEG images, etc) nothing is accelerated. But for vector (PDF, SVG, etc) everything is much faster. Luckily 90% of MacOSX's desktop (named "Aqua") is vector! There was some smart forward planning by Apple.

    [You can easily tell how much of MacOSX is built on Postscript/PDF, because you can export ANYTHING to PDF from MacOSX. Anything you print, and even when you take a desktop screengrab - it's all captured to PDF straight from the processing pipeline. If it's already in a postscript format, it makes sense to just write it to disk instead of converting it to something useless and unscalable like a JPG.]

    Windows and Linux are slow to catch up. Their desktops are still heavily raster, and vector is being added in slowly and in quite a hacked and kludgey fashion. *Some* Linux utilities like the PDF browser Evince can offload PDF/vector rendering onto the video card. I've played with this under Gentoo Linux, and with the right video card it can make simple PDF viewing MUCH faster. Documents open in mere milliseconds, compared to the 15+ seconds it can take in Windows using Adobe Acrobat (simply the slowest and worst PDF browsing software EVER made, IMHO. How it got so famous I'll never know).

    So yeah, nothing new there. Apple's been doing this for years, just nobody noticed. There are times when Apple do really stupid things, and there are times when they are lightyears ahead of everyone else. It's funny now that Linux and Windows are catching up how everyone (and when I say "everyone", I mean "journalists") is looking at MacOSX and going "oooh... I get it now!". These are typically the same folks who 3 years ago were saying things like "pointless waste of time" when referring to the same technologies they now talk up today. :)
     
    Last edited: Jan 18, 2009
  17. elvis:
    Great article: "Why Linux Has Zealots":
    http://penguinpetes.com/b2evo/index.php?title=why_linux_has_zealots&more=1&c=1&tb=1&pb=1

    People get very angry at Linux Zealots, and while I understand that the average Linux Zealot is pushy and vocal (myself included), it's important that people understand WHY we are pushy and vocal. Linux, and all the associated projects, are about YOUR FREEDOM. It's not about money, it's not about hating Microsoft, and it's not about trying to be cheap.

    Linux is popular for the same reason we live in a democratic society. The power needs to be in the hands of the users, and not of one minority group. As we rely more and more on technology, users need to ask themselves if the technology they use has their interests at heart or not. If not, then users need to ask themselves if there's something else out there that does, and if that's a better option.

    Linux is primarily a political movement (well, GNU is at least, but Linux has a lot to do with it). And while it suffers the same disadvantages as any community, committee, or government run organisation, the advantages are loud and clear: the end users win at the end of the day, because the software is about their personal, professional freedom. As the future moves on and software begins to help us with more and more - our identity, our political and voting systems, our tax systems, our medical systems - end users need to ask themselves if they trust the software and the people writing the software that control these systems, and ultimately our very lives.

    It's more than just your desktop OS. Much more.
     
    Last edited: Jan 18, 2009
  18. elvis:
    Linux Audio and Music Making Tools

    High-end music software is probably right up there with high-end graphics as some of the most expensive around. Any Cubase, ReBirth or Reason users will know just how expensive good music software can be.

    Apple came along with GarageBand recently, which is a nice low-end product that comes with iLife (and generally free of cost with a new Mac). Not full-on professional software like their Soundtrack or Logic Pro products, but good enough for the budding home musician.

    Interestingly enough, Nine Inch Nails have recently released their entire latest album in Logic and Garage Band formats! Good news for people who want to mix them up:
    http://www.nin.com/current/

    Linux Audio software is on the move. Certainly not a replacement for the big commercial players just yet, but I was pretty gobsmacked the other day when I had a look to see how far they've come in the last few years.

    In no particular order...

    Jack:
    http://jackaudio.org/

    Network audio system. Just like in a real music studio, where you have inputs and outputs running between various bits of hardware, Jack is a software version of that. Because it's network-based, any program that "speaks Jack" can connect to any other program. Have your MIDI program talk to your software synth, which outputs to your software mixer and into your software multi-track editor that's also taking input from your software drum machine. Whether they are all on one computer or spread across a network of machines, it doesn't matter. Create yourself the ultimate recording deck, all in software!
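That patching can be driven from the command line too, using the example clients that ship with Jack. A dry-run sketch - the client and port names below are hypothetical examples (run jack_lsp against a live Jack server to list the real ones), and the commands are printed rather than executed:

```shell
#!/bin/sh
# Dry-run sketch of patching Jack clients together from the command line.
# Client/port names are hypothetical examples.
run() { echo "+ $*"; }   # print only; swap for  run() { "$@"; }  on a live Jack server

run jack_lsp                                          # list available ports
run jack_connect system:capture_1 ardour:in_1         # mic -> Ardour input
run jack_connect hydrogen:out_L system:playback_1     # drum machine -> speakers
```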

    Ardour:
    http://ardour.org/

    Massive professional multi-track editor with gobs of features. Talks Jack (see above), so it plays nice with other software.

    Audacity
    http://audacity.sourceforge.net/

    Smallish wav editor. Great for just recording some sounds/samples/voices, or a single track for input into other software later.

    Jamin:
    http://jamin.sourceforge.net/en/about.html

    Jack mixer, equaliser and audio kit. Handy for making sure all your different inputs play nice with each other.

    Jokosher
    http://www.jokosher.org/

    "Garage Band for Linux" is about the best description. Multi-track editor with all the usual features including Jack support. This one is getting a lot of attention, and deservedly so from what I can see in the feature list.

    Rosegarden
    http://www.rosegardenmusic.com/

    Another big multi-track editor and MIDI sequencer aimed squarely at Cubase users. Looks quite powerful, and talks Jack as usual.

    Csound
    http://csounds.com/

    One for the software nerds! Program your own music as code - Csound compositions are written in its own text-based orchestra/score language. :)

    ffado
    http://www.ffado.org/

    Not music software per se, but a project aiming to get FireWire audio devices talking to Linux happily.

    Hydrogen:
    http://www.hydrogen-music.org/

    Linux drum machine. Doof doof! And of course, it talks to Jack like everyone else should.

    Jahshaka
    http://www.jahshaka.org/

    A few of you will be saying "hey, didn't he talk about this over in the video editing section?". Yes, I did. But Jahshaka actually started life as a multi-track audio editor and sound compositor for laying down soundtracks over movies. Despite being able to do video, it also does some audio stuff that can come in handy. And you guessed it, it talks Jack. :)
     
    Last edited: Jan 26, 2009
  19. elvis:
    Who writes Linux?

    Interesting question. Some people would have you believe it's hobbyist homebrew ware. Some people would have you believe it's tinker code from non-professional programmers.

    However some people actually like to RESEARCH these sorts of things before making claims:
    http://lwn.net/Articles/222773/

    What we see here is that the bulk of Linux kernel development is done by paid professionals from some of the biggest companies in the world. Whilst I have been quite guilty of painting all corporates with a broad brush of hatred, it's good to see that even in this corporate-mad world, people can work together for the common good, giving away their hard work without the stupid fears most have that open source is "cancer" or "throwing away your intellectual property".

    Outside of Linux itself, Open Source is a viable and profitable industry for everyone involved. It doesn't bias itself to large or small business, and it doesn't care if you're the USA, Brazil, or Zimbabwe. Open source means everyone can contribute and everyone can benefit, no matter where they are on the world's economic scale.

    And contrary to what seems to be popular belief, even the big players are lapping it up.
     
  20. elvis:
    More open source gaming for you Linux fiends:
    http://fijuu.com/

    Fijuu2 is now available in Ubuntu. It's a 3D music engine, allowing you to manipulate onscreen objects and make your own music. Neato.
     
