
Thursday, August 25, 2016

What is a Linux Distro?

We've got a number of folks in a Facebook Linux group I moderate who are playing around with the guts of their Linux by swapping around pieces from different distros. Which is cool--experimentation and tinkering are always good things.

It has raised a point that I think may require some background. Specifically, "What's a distro?"

Back in Ye Olden Days

When Linux first came around, those of us who wanted to play with it got source code. And that's all. We'd compile the kernel under some other operating system (Windows, another Unix, or maybe even the Mac of the time if you were a masochist!)

Then we'd start compiling binaries for our Linux kernel--a shell and cc would be a common place to start so that you could start compiling stuff for Linux under Linux.

After a while, we realized that we were duplicating a lot of effort. Many of us had fairly generic systems that we could swap binaries between without having to recompile absolutely everything.

With time, the concept of having a set of precompiled binaries for the kernel and applications as a starting point for a Linux system became the basis for a distro. Usually you'd install from a set of floppies or a CD. If the kernel you chose from the distro was close enough for your purposes, you'd just use one of the ones they'd precompiled. If you wanted something more suited to your hardware, you'd recompile the kernel with a few configuration changes and usually get much better performance and a more stable system, without having to recompile the application binaries for user space.

We still regularly compiled applications. Anything that wasn't bog standard usually would have to be compiled. As time went on, the list of precompiled binaries for each major distribution grew, and then distros started to have their own package managers to ease the installation and management of precompiled binaries.

The ability to do this is a large part of what has turned Linux from a tech toy used by a select few who were comfortable with compiling their own operating systems and applications into the generally popular operating system it is today.

Stepping Into Today

Now it's not uncommon for users to choose a distro according to the application set they want for their system.

Different Linux distros have sprung up with selected sets of software precompiled and ready to install and run, or as part of their base installation.

The specialization has reached back into the kernel, too, as maintainers of the distros have improved their distro by compiling a kernel that suits the base suite of applications they provide. Distros that are intended to be more general-purpose likewise make changes to the Linux core to make the installation and hardware detection process simpler and more robust to make installing and setting up Linux easier for users.

More software is more easily available to everyone because of this. All the major distros are tied to a system of package management that provides users with almost limitless choices of software to install on their system.

Going outside the distro's own software repositories (repos) is certainly possible, especially in the case of distros that are upstream of your own.

For example, if you're running Mint, it's a snap to include repos for Ubuntu and Debian in your package manager's list of repos. And chances are almost all the software there will run without problems. For all I know (I haven't installed Mint in a while), Mint includes the Ubuntu repos by default; I seem to recall that being the case in the past.

However, if you're crossing over between branches, like wanting access to software compiled for Fedora that's not in the Debian-family distros, then you're setting yourself up for more errors. The filesystem layout is different between Fedora and the Debians. There may be differences in the kernel configurations that will cause you problems.

If you want to get software that isn't in your distro's repos, or in a closely related distro (preferably upstream), then the correct answer is really to compile it yourself on your system, rather than use software someone has compiled on a different system configuration.

If you want to save other people the trouble, see about contributing your effort to your distro's repos. They'll probably have standards you'll need to meet to do that, but you can be part of the team that makes your distro that much cooler pretty easily.

Kernel Swaps

I've seen discussions of turning one distro into another on the Facebook group. Hopefully a better understanding of what a distro is, and how it is defined, helps to show why "Turning Ubuntu into Kali" or some such is a misnomer. If you're grabbing another kernel and starting your system off of it, you're no longer running either distro. You've gone custom.

That's not a bad thing; experimentation is a good thing. But you also want to weigh how effective playing mix-and-match with precompiled system files and applications really is against just compiling the stuff you want yourself.

If there's a specific distro whose advantages you want, it's best to install it rather than build a sort of chimera by grabbing pieces and lashing them onto your current distro. You're asking for trouble by using binaries that weren't built under the same compile-time assumptions.

If you want something truly custom, then rolling your own Linux is very probably the way to go. It's not hard, and that used to be the standard installation method. There are a number of systems out there to make it even easier than we had it in the old days (Linux From Scratch, Gentoo, etc.)

I hope this helps clear up some of the confusion about distros and their parts. :)

Monday, November 17, 2014

Microsoft's Cross-Platform Play

Microsoft has recently announced that they are moving .Net to open source and supporting Mac and Linux in addition to their Windows OS with Visual Studio Community 2013. Also, Visual Studio 2015 will support iOS and Android development.

Frankly, it's a little confusing: MS mentions Mac and Linux in only a few places, while iOS and Android come up in lots of places, but not everywhere. So the actual support may be different from what I've described.

Nevertheless, MS is opening up. Or at least seeming to.

Their timing couldn't be better, in my opinion.

Oracle is working hard to poison Java. I'm not entirely sure why, unless they think that ticking off every Java programmer and user outside those developing middleware for Oracle's products is a really great strategy. Their support for Java as a general programming language, the way Sun supported it, has been piss-poor since day 1. Every so often they make a grand gesture to try to present themselves as interested, but the product they offer now is not something I could send someone out to install with a good conscience. What kind of an honest, large-scale company thinks that half-sneaking some crap software in under the cover of installing their own is a really great idea? Not one that really treasures its individual customers, I can tell you (and yes, Adobe is on this particular excrement list, too.)

Couple that with their poor responsiveness to security concerns, to the point where the Java runtime is treated as a sort of worm or virus by most security software, and you've got a company that's decided to leave a hole in the cross-platform development market simply because their interests are elsewhere.

Cross Platform Alternatives

C# is Microsoft's outgrowth from their own attempt at embrace, extend, extinguish targeting Java. It's a very good language: it includes the good parts of Java, and it has a set of libraries (.Net) that keeps everything good about the Java API while dropping its less-than-useless historical cruft. That made Microsoft look really cool to programmers when it came out, since most weren't familiar with the Java API itself.

Switching to C# from Java is an afternoon's exercise. But it and the .Net library started out wed to the Microsoft platform.

Enter Mono, a cross-platform implementation of .Net supporting the C# language, based on Microsoft's ECMA standard for its products. It has been growing, at least in part on the back of Java's decline, and it brings the good things about C# and .Net to Mac and Linux development. It doesn't hurt that it is the core that drives products like the Unity cross-platform game-development system, too.

Now Microsoft is combining their products with Mono, and extending their reach to Android and iOS.

There are a Couple of Ways This Could Go (and Possibly More)

In Future A, Microsoft does the excellent work they do in producing development tools, but with a return to platform-agnosticism. This encourages programmers to develop for Windows and Windows Phone as well: programmers who might otherwise have been targeting only iOS or Android will pick these tools on their merits to write programs for their favored OS, then decide to toss a Windows/WinPho version out there too, since it costs them no significant extra effort and might end up fattening the coffers a bit.

Microsoft gains developers for its platforms, draws "thought leader" and technical leaders into their ecosystem, and the developers get the advantage of having good development tools for any major platform they choose.

In Future B, Microsoft uses this as a ploy to draw programmers in, but cross-platform support is sloppy, or delivered slowly, or lags behind the native capabilities of the non-MS platforms. Perhaps it doesn't play well with the various versions of Android, or maybe the Mac and Linux native code suffers by comparison with code developed with native tools for those platforms.

It turns out to just be hype. Maybe there are forces within MS that fight the release of solid tools that are truly cross-platform, so they ensure that foreign platform support is sub-par, thinking that they're helping their own products by making it "harder" to develop for other platforms. The only point of cross-platform, to them, is to allow Windows patriots to proclaim themselves cross-platform developers without knowing anything about the competition.

In this future, Windows does not become the programmer's platform of choice, Windows carries extra development costs that make it less attractive as a target when another platform is the one that's going to pay the bills (probably iOS), and life goes on as it does today, with the Microsoft touch possibly putting some poison into Mono.

I can only hope that the upper management at MS is committed to Future A, because that's what it's going to take to keep Future B from stepping in any time it pleases.

Microsoft, Doing You Know What in Their Own Messkit for the Past Decade

Windows 7 was a boon for programmers when it came out. I and many other programmers I know had been feeling somewhat kicked about by Apple, and didn't see tools with the same level of sophistication on Linux (which also meant more system maintenance than the commercial OSes, and system maintenance doesn't pay bills). Windows 7 was a viable place to go, or at least to have as a second OS on our desks for development and testing.

Windows 8 added some nice stuff behind the scenes, but not without wrecking the usability of the platform, as well as making it a far less attractive target for development. In my case, I abandoned Windows as a target for native development about a decade ago, and was waiting on Windows 8 to see if I might add it to my repertoire again. The answer became "no". Windows 10 will determine whether I ever consider it a serious platform for anything at all in the future, as the professional applications that I currently use on Windows have all become multi-platform over the past several years, so I can move my licenses over to Mac OS or Linux for all of them.

If Microsoft gets it right with their development tools, and Windows 10 restores the power to the desktop that earlier versions provided--not just a false appearance of it as in Windows 8.1--then they stand to become the de facto professional desktop again.

Whither Mobile?

And given that the consumer desktop is the market that is dying in the face of mobile platforms, one would hope that they are bright enough to strategically commit to a powerful professional computer OS again. While the mass market computer is probably going away, there was a profitable professional/hobbyist market before the internet boom put a computer in practically every household. That professional market stands to be even larger than the historical one, as many professions that had nothing to do with computers now rely on them, and many new professions have arisen over the past 25 years that rely on the computer.

Also, there's a possibility that the mobile device as a computer replacement is just a flash in the pan. Most people bought their mobile devices in place of a routine upgrade to their home computer for one cycle. Now that they have tablets, the tablet market is going flat while computers are seeing a modest rise in sales. I think just about everyone has had a chance to discover that the touch interface is very limited in what it can do with current technology. It's severely error-prone and it's not well suited to complex activities. It's too soon to tell now, but there's a chance the tablet may be relegated to specialty-use status, with the keyboard-equipped computer regaining its status as the "real" computer behind the smartphone's limited purposes as a personal communication and light entertainment device. We'll see.

The phone isn't going to go away, though. It's going to be the most personal of personal computers, at least until it's replaced by a small tablet and a complete phone the size of an earpiece, or some other revolutionary turn that gets people to give up a screen for convenience. So supporting the mobile platforms is, for Microsoft, probably a must for their survival. At least until they can get real market share for WinPho.

Either way, if Microsoft plays their cards right, going to platform-agnosticism in their development tools could be a really good thing for everybody--including and especially Microsoft.

Wednesday, May 4, 2011

Learning GCode with EMC2

I'm spending a lot of time with my new microCarve A4 CNC router this week. My first couple of items were made using a handy image-to-gcode converter that's built into the EMC2 control software I'm using.

But the image converter simply treats the image as a depth map which is cut by raster-scanning with the cutting head. For the designs I used, this was slow, and produced rougher results than would be produced using vector cuts.

So I looked at a couple of approaches to improve things. One is using CAD software that works well with a CAM package to convert the CAD design into machine control instructions to cut out the CAD shapes. The other is to go straight to writing my own machine control programs by hand. I know that I'll want to have both methods in my toolkit, but which to use first?

After a bit of back and forth yesterday morning, I decided to start with programming by hand first. So I dove into the EMC2 documentation for gcode, the programming language more properly called RS-274-NGC. What a catchy name, eh? You can bet that the folks who picked programming language names like "python" and "Java" are kicking themselves after seeing how "RS-274-NGC" rolls off the tongue.

[Image: Results of my first gcode program on my microCarve A4 CNC.]

Well, the EMC2 site has a link for a gcode tutorial, but what's there is...not much. Maybe I'll pitch in, since that's what wikis are for, right? Then I went and read the EMC2 documentation, which has the standard cart-before-horse format of discussing details before generalities. Then I found the excellent LumenLabs GCode Tutorial. Much better!

I read some bits, scanned others, then hit the keyboard on my CNC control system. It's an old Athlon 800 with 768MB of RAM loaded up with the EMC2 LiveCD install for Ubuntu Hardy Heron, with EMC2 upgraded to the current version after install.

I fired up EMC2 with the SIM-Axis configuration for developing the gcode. I've got three different configurations of EMC2 on my desktop. I've got the SIM-Axis setup, and two different configurations for my microCarve A4, each with different origins for the axes.

I used gedit to create an initial gcode file, then opened it in EMC2. The gcode preview window is great. Whenever I edited the gcode file and saved, I'd click the reload button in EMC2 and immediately see the changes. Likewise, the error messages were good enough to let me find my problems, though the problems were usually typos rather than what was reported.

I used iterative development, of course. No sense writing too much code before finding out that I didn't understand some element of syntax. I started with initializing the mode settings, lifting the head to a safe traversal height, traversing to a point in space, then returning to machine zero. After fixing a couple of problems, I got what I wanted. Then I added a few additional move commands, and got the simulated CNC to follow them. At that point I could see that things would get out of hand pretty quick if I didn't learn some basic flow control.
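
For the record, that first skeleton came out something like this (a minimal sketch; the coordinates, heights, and units here are just placeholders, not my actual numbers):

    (set up modes, lift the head, traverse, and return)
    G17 G21 G90 G94   (XY plane, millimeters, absolute coordinates, feed per minute)
    G0 Z5             (lift the head to a safe traversal height)
    G0 X20 Y20        (rapid traverse to a point out in space)
    G0 X0 Y0          (traverse back to zero)
    M2                (end of program)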

So I read up on subroutines in gcode. I laid out a simple key pattern on graph paper, and wrote the necessary routines. That's the border you see in the picture above. That was the easy part. It's all straight lines.
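
In EMC2's gcode, subroutines are done with O-words. My actual border routines were longer, but a simple rectangle sub shows the shape of the thing (a sketch; the numbering, depth, and feed are placeholders, and millimeter mode is assumed):

    (cut a rectangle outline: params are corner X, corner Y, width, height)
    O100 sub
      G0 Z5           (retract to a safe height before traversing)
      G0 X#1 Y#2      (rapid to the starting corner)
      G1 Z-2 F100     (plunge to cutting depth)
      G1 X[#1+#3]     (cut across the width)
      G1 Y[#2+#4]     (cut up the height)
      G1 X#1          (cut back across)
      G1 Y#2          (close the rectangle)
      G0 Z5           (retract)
    O100 endsub

    G17 G21 G90
    O100 call [10] [10] [40] [20]   (one 40x20 rectangle with its corner at X10 Y10)
    M2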

Next was curves. I read up on G2 and G3. I hadn't thought about the ability to shift the depth of cut across the curve when I started reading, but by the time I was done I was thinking, "Hmmm, if I vary the depth of the cut with a V groove bit, I can vary the width of the cut just as I would vary the width of a line with a calligraphy pen."
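
EMC2 lets an arc carry a Z word, which turns it into a helix, so varying the depth across a curve is just a matter of putting the new depth on the G2 or G3 line. A sketch of the idea, assuming a V bit and millimeters:

    (one S-shaped stroke that widens, then narrows, like a calligraphy pen)
    G17 G21 G90
    G0 X0 Y0
    G0 Z1                    (hover just above the work)
    G1 Z-0.5 F60             (shallow plunge: a narrow line with a V bit)
    G2 X20 Y0 I10 J0 Z-1.5   (clockwise arc, deepening to widen the cut)
    G3 X40 Y0 I10 J0 Z-0.5   (counterclockwise arc, rising so the line narrows)
    G0 Z5                    (retract)
    M2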

So I broke out a fresh sheet of graph paper, and started drawing some letters. Well, it took me about three times as long to lay out the letters as it took me to lay out the key pattern, but I managed that. Not only that, but I set things up with scaling factors and variable settings that allow me to easily scale and move the letters.
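
The scaling and moving is just arithmetic on the coordinates. EMC2 supports named parameters, so a stroke written once can be resized or shifted by changing a couple of values at the top (again a sketch; the stroke itself is invented for illustration):

    #<scale> = 1.5   (enlarge every letter by half)
    #<xoff> = 25     (shift the whole layout 25mm to the right)
    G17 G21 G90
    G0 Z5
    G0 X[#<xoff>] Y0
    G1 Z-1 F60                                    (plunge)
    G1 X[#<xoff> + 8 * #<scale>] Y[12 * #<scale>] (one diagonal stroke, scaled)
    G0 Z5
    M2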

Results
The results you see above are what I got from the first "live" run of my first gcode program. The cuts are a bit deeper than I'd like, and the 90 degree bit I'm using right now doesn't help. Also, the cutting was a bit fast for the plywood, causing the wood to be frayed on the cross-grain cuts. Still, the varying of "line weight" on the letters turned out well. Overall I'm happy with it, and the defects should be easy to fix when I run it again. I'm planning on building a complete alphabet for this font and throwing it into a file for later use.

Monday, September 29, 2008

Trying Out Free OSes & Giving Back

My high school computer class got a start trying out free OSes on the school's computer systems about two weeks ago. I brought in a stack of CDs and DVDs with free OSes on them like Linux, FreeBSD and OpenSolaris. I passed them out to teams of two or three students apiece and had them start testing them on our different systems.

Almost none of the students had ever booted an operating system other than the one a computer came with. Few of them had ever worked on an OS other than Windows, except perhaps for a short session with Mac OS on one of the Macs we have in the lab. Likewise, almost none of them had ever interacted with the system BIOS.

This activity was a success on so many different levels:
  • We had a lot of fun. The classroom was in a state of productive chaos throughout.
  • The distinction between hardware and software was demonstrated. That is, that there is a division between the operating system and the hardware it runs on.
  • Students were able to see and interact with new operating systems.
  • Students discovered that a different operating system was not so alien. They were able to find the applications they wanted and do at least basic tasks.
  • Things don't always work, and that's OK. There are ways of dealing with it.
  • You don't have to rely on any one OS. If one isn't working out for you, try another.
  • Different OSes and different applications give you lots of options for doing things your way.
As a conclusion to the activity, we took the next step--giving back to the community. To start with, we discussed how all these OSes came to be, and why they are free. We reviewed the history of Linux and BSD in particular--how large things have happened as a result of the contributions of many, many people, each giving what they can, from detailed bug reports to programming. Then I let them know that they could make a contribution to the community by reporting their test results.

I had the class post their final results on the Linux Compatibility Database.

Afterward, I did a quick informal survey. About 80% of the class is interested in trying out some of our new OSes on their home systems to use as an alternative to what they have now. The holdouts are apparently not sure what their parents would think. We're going to have a future activity where we duplicate some disks for use in class and for the students to take home.

This activity turned out really well. It accomplished my goals for the class admirably. I wanted to give the students something in the way of a hands-on activity that would really make them feel a sense of control over the computer systems, and break through any remaining shyness. I also wanted to instill a sense of confidence beyond that: confidence that they can handle new programs and ways of doing things. Even experienced users sometimes show resistance to trying something new. Further, I wanted to broaden their horizons a bit.

We've achieved all these things, and the fun that came with the activity has left the class hungry for more. It's a great base from which to continue our learning.

Tuesday, September 16, 2008

Trying Out Free OSes

As a class activity I had my high school students try out several different free operating system distributions on the school's different computers in the lab. I brought seven different Live CDs, and we have five different computer types in the lab. I had the class break up into teams of two or three and go round-robin to the different types of systems and test each OS.

The computers we have include two desktop PCs, one a Dell, the other a generic PC (perhaps assembled by our tech, I'm not sure). We have some Fujitsu laptops for student use, and two models of iMac. I didn't pull all the system stats today; I'll do that next class on Thursday, for when I publish the completed test results.

The partial test results can be found here. This document will be updated with further results as we get them.

Today we tried out iLoog 8.02, Edubuntu 7.10 (the last Edubuntu on a LiveCD--it still frosts me that it takes a second disk under 8.x), Belenix 0.7, Ubuntu 8.04, and OpenSolaris 2008.05. If time permits, we can also test Sabayon 3.4e and Mandriva 2008 Spring. These are on DVD, and only about half the lab's systems can read DVDs, so that limits their value to us.

This turned out to be a good class activity for all the students at different skill levels. Some of the students had no experience trying to boot to some operating system other than the one preinstalled on the computer. For them, we got to learn about modifying BIOS settings to change the boot priority of the drives on PCs, and using the Option key to select a boot device on the Macs. I'm sure I'll have to review this during our lecture period in our next class to make sure it sticks.

For others, it was good for them to get a chance to try to boot an OS with no idea of what results they were going to get. Getting to see failures during boot and having a sense of what was going on, why the boot failed, and learning when to give up was informative for them.

I tried to hand out OS disks to the teams based on my assessment of their skill level. I gave the OSes I expected to be more compatible and to have the least problems to those teams that were probably less skilled. This turned out to work well for a couple of the teams, but it didn't matter much otherwise. As it turned out, the skill levels had a greater effect on the speed at which a team could try out an OS and move from system to system than it did on how they could deal with the compatibility problems. Just about all teams needed the same level of oversight. Only one team got caught up in games enough to need a comment.

This testing was not only educational for the students, in some cases giving them their first exposure to an OS other than Windows; it was also very helpful for me. Once we do some more testing I'll be able to see which OSes I should duplicate to use with my middle school class on the basis of what works where, and what apps are installed. For that class, it would be nice to have one OS that works on all lab systems, but splitting between two different ones with the same apps would work as well.

I'll also be offering copies of the disks to students when we complete this activity. In fact, I'll probably make an activity of copying the disks on those systems that can do so. Then the students can try the same thing on their home systems.

Wednesday, August 27, 2008

Ubuntu Works!

Classes start next week. Between meetings today I had time to slip into the computer lab and spin up the Ubuntu Live CD for Hardy Heron. It works great with the PCs in the school's lab. Networking works fine, video is good, apps run, and so on.

I'm really pleased about this, even if it's not an Edubuntu Live CD, as I've mentioned elsewhere. I'll be able to use this disk in class even if it doesn't include some of the Edubuntu apps I'd like to have. I'm not in a position to install Ubuntu so that I can apply the Edubuntu stuff to it. I need everything on one Live CD.

In addition, I may download the older v.7 Edubuntu Live CD image to get the Edubuntu apps. After all, that's what I was told I was getting when I ordered the CD; what I got was the Hardy Heron upgrade, where Edubuntu no longer fits on the same CD as the core Ubuntu. The fact that Hardy Heron works fine on the school's hardware means that it's reasonable to hope the prior version will do the same (we're not talking cutting-edge hardware here.)

I was hoping to spin up some other free OS Live CDs, but my time went into making a copy of Ubuntu for a former student who happened to be in the school center instead. She has a computer at home that's a close cousin to our lab PCs, with some problems that a switch to Linux might fix.

The joy of free software. All I had to do was pull out a blank CD and make a copy for her. No fuss, no muss, no shrink-wrap licenses or fees.

And knowing that I've got at least one Live CD that just ups and runs on the school's computers feels really good.

Thursday, June 26, 2008

Trying Out LiveCDs

I'm still looking for a Linux or FreeBSD LiveCD for my high school computer class. I've had some success, but there's more to do. The idea is to have an OS disk that we'd run off of during class with the tools we need all there, ready to go.

No Room, No Room!

The computers in our lab at school have very limited disk space. Windows XP is installed on them, along with programs needed for other classes. That doesn't leave much room for graphics programs and development systems that I'd like to use for my classes, both the middle school and high school classes. So I thought I could use LiveCDs to solve the problem. Only the student's data files would need to be stored on the hard disk.

Belenix

I downloaded the Belenix 0.7 LiveCD the other day because the list of packages appeared to suggest that it includes the JDK, though actually it doesn't--that was my mistake. I got impatient with scrolling through the unalphabetized list and used my browser's "find in page" function. What I missed is that there was a break in the list, between items that are on the LiveCD and packages that aren't there but are available. So, after downloading the .iso file, burning it onto a disk, and spelunking around, I found it wasn't there.

OpenSolaris

I also took another look at my OpenSolaris 2008.05 disk to double-check whether the JDK was on it. I'd seen a comment on somebody's website somewhere during my search for a LiveCD with the JDK on it (I've lost the link, I'm afraid--it was someone who used to have their own LiveCD distro with the JDK, who gave it up with a comment along the lines of "I'm just going with OpenSolaris, it has the JDK.")

iloog

In spite of my slip-up on Belenix, I managed to locate a LiveCD that does have the JDK on it--in fact, it's got a heap of development tools on it. It's the iloog 8.02 LiveCD. It's based on Gentoo, and while it doesn't want to start up X on my newer iMac, it runs very well on my older MacBook, so far as I've tested it (only briefly so far; I spent too much time on OpenSolaris and Belenix today). It supports the MacBook's built-in mousepad (unlike OpenSolaris and Belenix) and it actually boots quickly compared to the two OpenSolaris distros. I haven't tried any networking yet, but everything else seems to function.

System Dependencies and Finally, Sabayon

Given that the Macs in the school lab are about the same age as my MacBook (Core Duos) I'm hoping that it runs as well on them. If I'm lucky I'll have a chance to try it out this week, if not, then next week. The PCs in the school lab are pretty plain-vanilla hardware-wise, so I'm less concerned about them. (If the PCs in the lab had DVD readers in them, I'd probably have just gone with Sabayon's LiveDVD and called it a day a couple of months ago.)

Tailor Made?

There's still the open possibility of making up a custom LiveCD for the class. My time is such that I don't have much opportunity for a lot of tweaking, though. Installing some apps is within reason, dinking with hardware detection and drivers is getting deeper into things than I can afford. I'll probably look at Morphix in this regard.

However, at this point iloog is looking promising. Besides Java, it has Common Lisp on board, as well as ruby and python. I think I also saw Haskell when I was poking around. There's no way I could properly cover all these in class, but if time permits I could potentially spend a class (90 minutes) on each of two or three of the non-Java languages before we start Java.

Not Groovy, Dude.

Groovy, unfortunately, was not among the languages represented on the iloog disk. As I've mentioned before, the chances of my being able to prepare a full semester of classes on Groovy for this year are pretty slim.

Edited in 2010 to add:
Epilogue

I ended up making very good use of a variety of LiveCDs by having my class try out the different OSes on the school systems. And Java worked out well in spite of its complexities and niggling demands for exactitude, though a well-rounded scripting language would have been better. If Sun had ever really sorted out JavaFX, rather than going halfway and stopping, that might have served. But when they write half a language and then tell you to fill in the rest with the hooks to Java, that's no solution for a classroom.