Friday, July 03, 2009

Firefox: The progression of popularity and the stigma of the Geek.

Let's take a moment and look back about five years at the state of the GNU/Linux desktop in an emerging web-based world. There were really only two web browsers worth mentioning, Mozilla and Netscape (which in hindsight were essentially the same thing). The problem? They were heavyweights in terms of the resources they required. So what happened? Mozilla released Phoenix, and it was amazingly fast; nobody could believe how quickly it would fire up and run on the old Pentium II machines they had slapped Linux on in an attempt to breathe some life back into them.

At the time only those "in the know" were running the browser, but it was quickly gaining steam, just in time for a name change to Firebird thanks to angry people with trademark hooks on the name. For a decent number of users this caused enough confusion to create a rift in its general usage, but as time progressed and users became aware of the name change, things were back to normal. Forums were booming with the merits of the browser as its popularity grew; it was insane how fast your browser could be. It truly raised the bar that users measured every other browser against. Now that we'd gotten some happy users, let's go ahead and change the name again. This time the Firebird database people were upset, so Mozilla politely obliged and changed the name once more. Thus Firefox was born and the web browser revolution was under way. Firefox hit the ground running with features no one could compete with: it was wildly extensible, "secure" (I always use that word with a grain of salt), open source, and fast. This is truly innovation that will go down in the history of computing.

Let's fast forward to today, walk into a room of GNU/Linux aficionados, and ask "What's your opinion of Firefox?" As we make this inquiry, let us remember that this is the same demographic that half a decade ago was singing the praises of the now mainstream browser. The responses you receive will probably be something along the lines of "I don't use Firefox," "WebKit pwns Firefox in the face," or "Bloatware is annoying." What happened? Geeks are fickle creatures, that's what happened. We love the latest and greatest tech that nobody else is using because it's new and shiny, it's fast, it shows promise, and because nobody else is using it we are somehow elite for doing so. What happens when that new shiny tech reaches maturity and succeeds in a big way? Firefox happens.

Here's the reality of the situation: yes, WebKit is cool as hell from a geek standpoint because it's new and shiny, but Firefox is tried and true. It supports all the latest and greatest web tech, it's popular as hell, well supported, stable, "secure" (remember that grain of salt), cross platform, fast, extensible as ever, open source, and it just flat out works. I'm not saying you should turn your nose up at WebKit in any way, shape, or form, because it truly is the new shiny tech that shows a lot of promise. But I'm tired of people bitching and moaning about Firefox's "issues" when all the arguments I have heard thus far are simply cases of a Geek stigma haunting what is now too mainstream to be "cool" or "l33t" enough for those of us who pride ourselves on our technological prowess.

Let's try to be Geeks who are happy for that which emerges from our depths as a great mainstream success in the user-share market.

Thursday, April 02, 2009

Campus Ambassador Presentation: Introduction to Fedora

I recently gave a presentation on Fedora Infrastructure to the Sam Houston Association for Computer Scientists (the Sam Houston State University student organization for the CS department), and I had quite a bit of fun with it (though some of the audience seemed a little overwhelmed; they are college students and this is a bit enterprise-level for them, but I think it was very important to introduce such a system to them). I appreciate the Fedora Infrastructure in a big way: being a systems administrator is how I pay the bills, and I have an incredible appreciation for what the Fedora Infrastructure team does on a daily basis to keep Fedora as a whole working smoothly.

I started out in a large classroom with a big whiteboard and a projector hooked up to my Fedora 10 (Xfce Spin) powered laptop. With dry erase marker in hand I began to boggle the student body's minds. Here is a brief overview of what I covered. (It's not entirely brief, but I covered a LOT of material in the hour I spoke, so I summed up where I was able in my Ambassador Report.)

Key points:
- What is infrastructure? -> Infrastructure, in terms of Fedora, is a series of integrated tools that drive Fedora forward and create an extremely powerful development environment.
- Why does infrastructure matter? -> Without it the development cycle would largely be chaotic; with it we can bring procedure and structure.
- The Fedora Infrastructure team has a motto posted in the topic line of their IRC channel: "We run the servers that run Fedora." This is largely true, because without the infrastructure not much happens. The infrastructure team, just like the development team, is made up of volunteers who are willing to contribute their time toward the greater good of the project as a whole.
- Core Components of Infrastructure -> Fedora Hosted, Koji, Bodhi, Bugzilla (I felt this deserved inclusion even though it's not managed by the Fedora Infrastructure team), Fedora Account System, Package Database, Mirror Manager, Smolt, Planet, Fedora People. (I completely forgot fedora-cvs in my slides, but there was a whiteboard on the wall where I was drawing how the entire infrastructure fit together, so I was able to add it on the fly.)
- What does it mean to me? -> As a developer, contributor, or even just as a user, these are the components relied upon to keep everything functioning. We need a build system for new packages, an update system, a bug tracker, a place for new packages to be submitted, and web space for miscellaneous Fedora work, including but not limited to the new package review request procedures. This makes it all possible.
- Fedora Hosted -> What would be considered "upstream"; this part of the infrastructure lets developers host their project with a ticket tracking system, a version control system, and a wiki. Each piece is extremely useful for a collaborative development environment, and developers get their choice of cvs, bzr, svn, git, or hg.
- Fedora CVS -> I know I don't have a slide on this, many apologies to all, I really can't believe I missed it, but I did cover it. This is where packagers upload packages for inclusion into Fedora, where package patches are stored, and where builds are spawned from.
- Koji -> If you're a packager, this is an element you will get quite cozy with: it provides the build system you submit packages to. Koji offers a web front end that lets you and others monitor the status of your build and its logs, obtain the resulting package or source package, run what are called "chain builds" (I went into a quick overview of this on the board), and have the package built on multiple architectures in one wonderfully automated swoop.
- Bodhi -> Provides an update management interface, integrates with Bugzilla, will push updates based on karma, allows tagging of update type, and can recommend a reboot for users of PackageKit. It also provides statistics on updates. This is the system that pushes updates out to the mirrors.
- Bugzilla -> The place to file a bug against any component of Fedora; it keeps all related parties up to date on the current happenings of a bug.
- Fedora Account System -> Where so much magic happens it's amazing; this is truly where the integration of the entire system comes to light. When you create a Fedora account you can be granted privileges to any other component of the infrastructure. The account system keeps track of user information, group memberships, permissions, and security keys, among other useful information.
- Package Database -> Not only a user-searchable database for those on the web, but also a web-based management interface for access to different packages. It ties together information from fedora-cvs, Bodhi, Koji, and Bugzilla as it pertains to the package. This is a wealth of information I've never experienced in other development environments.
- Mirror Manager -> Package updates are pushed through here; mirrors are literally managed (the name kind of implied that one). It provides a quite impressive management interface for those who want to run a mirror of their own, public or private, with fine-grained choice of which "branches" of the repositories to host.
- Smolt -> Statistical accumulation of hardware information. I personally think this is quite unique in that anyone can go and check what hardware is popular and from what vendor, which I can only imagine is valuable information for those who develop kernel and system-level components of the GNU/Linux platform, most notably for Fedora.
- Planet Fedora -> Aggregated blog posts; a great place to get news on what is currently happening in the Fedora world (or planet, if you prefer).
- Fedora People -> Where contributors can post whatever they need in a web-accessible location for current work, whether it's documentation, art, a package, or another piece of the grand Fedora puzzle. This is the place for it.
- How is it all developed? In an open source environment, by the community, in a collaborative and innovative manner... just as it should be.
- Technologies used to develop the Infrastructure -> Python, TurboGears, Kid, Genshi, SQLAlchemy and Cheetah.

Slides available here.

Tuesday, February 24, 2009

First ever Fedora Ambassador Tech Talk

Gave my first tech talk as a Fedora Ambassador. I presented to the student organization at the university I attend, known as the "Sam Houston Association for Computer Scientists" (SHACS for short). I wanted to introduce my fellow classmates to the wonders of open source, Linux, and most notably Fedora. I was lucky enough to have received an Ambassador Kit from my sponsor, so I was able to hand out free media, buttons, stickers, and a couple of t-shirts to those in the audience who were already on the Fedora bandwagon and were just interested to hear what I had to say. This was a big hit; I thank inode0 for my kit.

Key Points that were covered:

* What is open source?
- Open source is software that you can download, modify, and redistribute in source form, as per the license it has been released under.
- Open source != freeware; open source is not inherently "free as in beer", that is just a common side effect. (Case in point: Red Hat Enterprise Linux)

* What is Linux?
- Linux is a kernel that is coupled with the GNU userspace, along with thousands of open source projects, to provide a full-featured operating system, which in the end is commonly referred to as "Linux" for shorthand.
- Linux is currently the largest open source project of its kind and supports more hardware than any operating system in the history of computing (Thanks to Greg K-H for that zinger of a quote)
- Linus Torvalds wrote and released the first version of Linux as a sophomore in college (this is the time to develop and innovate in an open environment, we are the future)

* What is Fedora?
- Fedora is many things: a distribution of Linux, a community, an infrastructure, and an outlet for ideas to come to life in ways that were not previously possible.
- Fedora is a place to jump in and get involved in all stretches of the Linux and open source world; it is a place to bring your interests, your talents, and your concepts in order to contribute to the greater good.

* Why should you care?
- Fedora means a lot to me because it's a project that makes a point of working with upstream open source projects in an attempt to better the open source world as a whole. Its development process reflects this, and if/when you get involved you will see it too.
- We are all computer scientists, we are all college students (or professors), and now is the time to get out there and do something with our knowledge, and do it in an open manner.
- Now is the time to truly innovate and do so out in the open (Notice I keep saying this? Hint, Hint).
- We owe it to ourselves, we owe it to our community of developers and users, and we deserve better than the proprietary wares that have been peddled onto us for so many years.

* Who uses Fedora?
- Linus Torvalds, the creator of Linux, runs Fedora.
- IBM's Roadrunner, the fastest supercomputer on the planet, runs Fedora.
- NASA and the FBI run Fedora.

* How to get involved?
- The wiki covers all sorts of documentation on how to get involved. I am planning another talk on getting involved, covering everything from making a Fedora account to getting a package accepted by Fedora, all the way to pushing it out to the repositories through the wonderful infrastructure that is available.

* How to get/give help?
- Referred the audience to the wiki page on communications and discussed the roles the different mailing lists play, as well as the IRC channels.

Ended with a Q&A session.

Special thanks to Max Spevack for the slide show I based mine on, and also for maintaining the statistics; I covered them in my slides and it was nice to have real-world numbers to show. The map was also a big hit, graphical goodness is always fun.

EDIT: Forgot to upload the presentation slides, now available here.

Thursday, January 15, 2009

To Twitter or not to Twitter...

So... I haven't posted in ages, and it's mostly a time constraint; I'm busy all the time, but I hope to post here more often in the near future as I do more concurrent programming research for my professor. I have, however, started to twitter. I always said that I wouldn't, but I did, and I promised myself it would only be for things that don't suck (mainly technical posts). The only reason I got a Twitter account is that I have a T-Mobile G1 with the almighty Android OS and there's a Twitter client in the Market. So I can quickly post from wherever, whenever, while I'm doing whatever, which makes it nice. Hope to post half-decent cognitive thoughts in the near future, laters.

Monday, January 28, 2008

Quote to go with current events...

I haven't posted in a while, and this technically isn't really a post but a quote that I had to slap up somewhere because I feel it is all too fitting for what is happening in the world around me.

"Naturally the common people don't want war... but after all it is the leaders of a country who determine the policy, and it is always a simple matter to drag the people along, whether it is a democracy, or a fascist dictatorship, or a parliament, or a communist dictatorship.Voice or no voice, the people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked, and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same in every country."
-- Hermann Goering, Nazi and war criminal, 1883-1946

Thursday, September 06, 2007

A Simple Lexical Analyzer .... Enjoy.

I got bored and wrote a simple lexical analyzer and thought about walking through an explanation, but in all honesty it's rather self-explanatory, especially with the inclusion of the diagrams I made and the BNF grammar I am supplying to go with the code snippet.

The grammar simply parses words or phrases in a file, with whitespace as a delimiter, requiring that all words or phrases start with a letter followed by any combination of letters and digits.

BNF:


G[<Word>]

<Word> ::= <Letter> | <Letter> <LetterDigit>
<LetterDigit> ::= <Letter> | <Digit> | <LetterDigit> <Letter> | <LetterDigit> <Digit>
<Letter> ::= a | b | c | d | e | f | g | h | i | j | k | l | m | n | o | p | q | r | s | t | u | v | w | x | y | z
<Digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
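
As a quick cross-check of the grammar (this regex is my own restatement for illustration, not part of the lexer itself), the same token shape can be expressed as a regular expression:

```python
import re

# <Word> ::= one letter followed by any mix of letters and digits
word = re.compile(r'^[a-z][a-z0-9]*$')

print(bool(word.match('ub3r')))   # True
print(bool(word.match('3leet')))  # False: must start with a letter
```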

A graphical representation of the grammar:


And the finite state automaton:

Finally, the code ... in python of course :)

Note: If you copy and paste this code it will not work because for one reason or another I can't get blogger to format the tabs correctly, please download the code from the link provided at the bottom.


UPDATE: Now featuring code indentation goodness.

###############################

#!/usr/bin/python
import sys

class SimpleLex:
    """@Author: Adam Miller - Simple lexical analyzer"""

    # allowed alphabetic characters in grammar
    alpha = 'abcdefghijklmnopqrstuvwxyz'

    # allowed digits in grammar
    digit = '0123456789'

    # white space symbolic representation
    wSpace = ' \n\t'

    # grammar table of relations (rows are states, columns are input types)
    # states: 0 = start, 1 = inside a word, 2 = success, 3 = error
    table = [
        [1, 3, 0],
        [1, 1, 2],
        [2, 2, 2],
        [3, 3, 3],
    ]

    # relations of input type to their column index in the table
    relations = {
        "alpha": 0,
        "digit": 1,
        "wSpace": 2
    }

    def __init__(self, file_name):
        """open a file, extract contents, close file,
        initialize character list and state variable"""
        self.f = open(file_name, 'r')
        self.lines = self.f.readlines()
        self.f.close()
        self.chars = []
        self.state = 0

    def scan(self):
        """process the file's contents one character at a time"""
        for line in self.lines:
            for c in line:
                # 3 is the error state
                if self.state == 3:
                    print "Error"
                    self.state = 0
                    self.chars = []
                # 2 is the success state
                elif self.state == 2:
                    print ''.join(self.chars)
                    self.state = 0
                    self.chars = []

                # let the table drive state transitions
                self.state = self.table[self.state][self.lex(c)]

        # print the final character buffer because the loop ends
        # before the last success state can be checked
        print ''.join(self.chars)

    def lex(self, c):
        """buffer c and return the column index for its input type"""
        self.chars.append(c)
        if c in self.alpha:
            return self.relations["alpha"]
        elif c in self.digit:
            return self.relations["digit"]
        elif c in self.wSpace:
            return self.relations["wSpace"]
        else:
            # anything outside the grammar is treated like a letter
            return self.relations["alpha"]

# use the code I just wrote
if len(sys.argv) != 2:
    sys.stderr.write("Usage: simpleLex.py <file>\n")
else:
    lexer = SimpleLex(sys.argv[1])
    lexer.scan()
###############################

For your viewing pleasure, a small example of the "little lexer" in action:


###############################
adam@pseudogen:~$ ./simpleLex.py lexFile.txt
hello
world
from
my
ub3r
l33t
lexical
analyzer

###############################

The code, the diagrams and the file used in this example are all available here.

Wednesday, September 05, 2007

Computer Science: Through the eye of those on the other side of the looking glass

So I sit there in front of my faithful computer, hacking away at what I am pretending is homework so that I think I am actually being scholastically productive. As I sit, I receive an instant message from a good friend of mine who is a Computer Science major at Rice University, and he has a question about some Mac software's auto-backup capabilities. I answer to the best of my knowledge and the conversation comes to a halt. I simply assume he is troubleshooting something for someone, more likely than not his girlfriend, for she is the proud owner of an Intel MacBook; nevertheless, I go back to my procrastination like a good student.

A few more minutes pass by (it could have been over an hour for all I know; procrastination isn't really the best timekeeping activity) and I get an instant message from my friend's girlfriend stating that she has been given the task of writing an essay on "What Computer Science Is" for her introductory computer science course at Rice (we will assume for all practical purposes that this course is for non-CS majors). She told me about the daunting tale that was writing said essay, along with the irony that the word processing software she was using had crashed and she lost everything; but being the good student she is, she trudged on and rewrote it from scratch, though this time with a different attitude and a delightful spin on it all. Her essay was far too priceless not to publish somewhere, and I got to it first! So, without further ado, I present to you "What is Computer Science?" by Rachel Gittleman:

" Earlier this evening I wrote a short essay on the definition of computer science, a discipline I am familiar with only through the influence of a computer scientist I have been dating for nearly three years. I did not let him read it because I was embarrassed by how little I actually knew about the subject, and as I was in the process of saving and finally being done with the humiliating exercise, my word processor crashed and took my hard-won essay along with it. Even with the help of said computer scientist boy-friend, the essay proved to be irrecoverable.

So what is computer science? A couple of hours ago my answer was as optimistic and technical as possible for someone who really has no idea what she is talking about. I stressed the dichotomy of the discipline as both a study of computation and computational machines--theory and practical programming, math and engineering. I mentioned the relative newness of the discipline compared to others in academia (I even dropped famous names), and conjectured that views on computer science must be very different now than a mere decades ago, but that at its essence, the field is about what programmable machines can do and how to make them do it. Having my word processor crash was disheartening and, after fruitlessly trying to recover the document, I was convinced to rewrite the essay in its current form.

Computer science is new, rapidly evolving, and incredibly broad. It has the power to drastically improve the quality of our lives over incredibly short spans of time, and in the past few decades it has provided us with earth-shatteringly new tools that quickly have become integral parts of the daily routines of even the most computer-illiterate (myself sadly counted among them). Although these qualities make computer science fascinatingly current and applicable in a way that the older sciences and humanities are not, there are also inherent downsides, particularly when it comes to accountability. The constant demand for new programs and applications and the inability of most of the public (again, including myself) to understand what goes into making them naturally results in buggy programs that can crash without notice, taking hard-working students’ single-spaced essays on computer science with them. I think the public would do much more than sigh and send an error report if a mechanical engineer built something so easily broken.

I still think computer science is the study of computation and computational machines, and I still think that it is an exciting new discipline that I want to know more about (that is why I am taking an introductory course after all), but I also want to make sure to express my hope that one day all that theory and engineering and programming will eventually be accountable for making me a word processor that will not crash."

Wednesday, August 15, 2007

Why Kernels don't matter:

In the vast world of UNIX-like environments we are confronted with users who will make a statement such as "I run Linux" and honestly think that is the all-encompassing term for the environment they know and love. This ultimately bothers me for the simple fact that kernels no longer matter, and what these users need to understand is that it's GNU that needs to receive the credit. Now please, before you spam my inbox with "Power to Tux" emails, just hear me out.

Let's take into account that 95% of users are concerned with only three aspects of a kernel:
1) does it support my hardware?
2) is it secure?
3) is it stable?
And by no means in that order, but let's just address them in the order listed.

Does it support my hardware? Well, that all depends on which kernel you are looking at, but given the current state of mainstream kernel development it is more likely than not that all of your hardware is fully supported, with the exception of wireless chipsets, simply because the whimsical world of open source wifi is a cruel and unusual whore. Also notable about hardware: if you just purchased the latest and greatest, odds are that the current development version of the kernel of your choice supports it even if the latest stable release does not. No, this isn't wonderful news, but good things come to those who wait. One last thing to mention about hardware support: in the event that one open source kernel supports a piece of hardware, all the rest (mainstream) do as well.

Is it secure? In short, yes. Every mainstream and widely accepted system kernel today has a level of security that is acceptable to at least a large group of followers. Does this mean "install, run, secure"? No, of course not. A kernel can only be so secure by default and still be deemed usable. This raises the question "but how do I make it secure?", and the answer is ancient and often used to torment new users, but simple: "Read the Manual." This is not meant to be clever, snide, or coy; it is sincere, in that contributors spend hours documenting software, so please do not allow their efforts to go in vain.

Is it stable? That is honestly a question I cannot answer for you, because the definition of the word 'stable' in the software universe appears to be translated very loosely. Is it "Microsoft stable"? Yes, and then some. I am willing to bet that any *n?x-style kernel you use will be more stable than anything The Blue Empire will wrap up and try to sell you. Is it "debian GNU stable"? That is more likely not the case, but at the same time many people find the debian definition of "stable" to be far too strict (I find it to be perfect, but it's all a matter of opinion).

Now that all three main concerns have been addressed, where do we go from here? Well, how about clearing up how kernels don't matter? OK.

When you boot into your "Linux" installation you are actually booting into "GNU/Linux", which is in fact just a port of the GNU userspace to the Linux kernel. "Is there any other port?" Yes, many. The GNU/OpenSolaris port, the debian GNU/kFreeBSD port, the debian GNU/Hurd port, and the debian GNU/NetBSD port are all prime examples. These are quite possibly not all the examples, but my knowledge of the debian GNU world is more intimate than my knowledge of other GNU communities/projects. If the average user were to sit down at any of these kernel ports of the GNU userspace, they would more likely than not be unable to tell the difference, simply because "the kernel does not matter."

Now, allow me to retort against myself and offer some food for thought. Kernels do matter: it matters very much whether your kernel is capable of many things, including POSIX compliance for semaphore capabilities, pthreads, and other important features that are necessary for porting applications. Your kernel also matters in the sense of resource management: whether it has a sufficiently fast network stack (and, more importantly, whether that is something you are worried about), an efficient process scheduler, the ability to multitask without excessive overhead, and the ability to allocate memory in a timely manner and continue to manage it efficiently. There are many aspects of a kernel that must be taken into account in selecting one that is "for you", but again... these things do not matter to the average user, and thus "kernels don't matter."

This has been yet another random babbling brought to you in part by a very bored 'Me' ... hope you enjoyed.

Wednesday, April 11, 2007

debian etch stable release

debian GNU/Linux has officially released version 4.0r0, code-named "etch". I am a few days late on posting, but basically the results of my installing it are this:

debian came, debian dominated, enough said.

..... my home desktop will once again be running debian for the foreseeable future.

Monday, April 09, 2007

Windows 2000 "mock source code"

/* Source Code Windows 2000 */

#include "win31.h"
#include "win95.h"
#include "win98.h"
#include "workst~1.h"
#include "evenmore.h"
#include "oldstuff.h"
#include "billrulz.h"
#include "monopoly.h"
#include "backdoor.h"
#define INSTALL = HARD

char make_prog_look_big(16000000);
void main()
{
while(!CRASHED)
{
display_copyright_message();
display_bill_rules_message();
do_nothing_loop();

if (first_time_installation)
{
make_100_megabyte_swapfile();
do_nothing_loop();
totally_screw_up_HPFS_file_system();
search_and_destroy_the_rest_of-OS2();
make_futile_attempt_to_damage_Linux();
disable_Netscape();
disable_RealPlayer();
disable_Lotus_Products();
hang_system();
} //if
write_something(anything);
display_copyright_message();
do_nothing_loop();
do_some_stuff();

if (still_not_crashed)
{
display_copyright_message();
do_nothing_loop();
basically_run_windows_31();
do_nothing_loop();
} // if
} //while

if (detect_cache())
disable_cache();

if (fast_cpu())
{
set_wait_states(lots);
set_mouse(speed,very_slow);
set_mouse(action,jumpy);
set_mouse(reaction,sometimes);
} //if

/* printf("Welcome to Windows 3.1"); */
/* printf("Welcome to Windows 3.11"); */
/* printf("Welcome to Windows 95"); */
/* printf("Welcome to Windows NT 3.0"); */
/* printf("Welcome to Windows 98"); */
/* printf("Welcome to Windows NT 4.0"); */
printf("Welcome to Windows 2000");

if (system_ok())
crash(to_dos_prompt)
else
system_memory = open("a:\swp0001.swp",O_CREATE);

while(something)
{
sleep(5);
get_user_input();
sleep(5);
act_on_user_input();
sleep(5);
} // while
create_general_protection_fault();

} // main

/* I saw this posted on the ubuntuforums and thought it was too funny not to post here as well.... enjoy :) */

Thursday, March 15, 2007

The case of the missing switch statement

For all my C/C++ and Java programmers out there who have come to know and love the reserved word switch, it is rather interesting to venture in the direction of a language that lacks such a statement, just as I have. I have recently delved into the realm of what I will call "open source at its finest" by learning the Python programming language. Python is a very expressive, flexible, powerful, object-oriented interpreted programming language (details, if you want/need them, here) that I have grown to love over the past month of learning its ways, but I recently stumbled across something that I originally thought to be an oddity: the lack of a "switch" or a "case" statement. I later discussed it with a friend of mine, who simply said, "Why do you need a switch statement?" and I really couldn't find a solid answer. He went on to explain that a switch statement is simply a way to make a complex/nested if statement faster: when it compiles, there is just a static jmp (for those of us sadly stuck on an x86 machine) in the assembly to where it needs to be, instead of multiple compare operations. Thus, since Python is interpreted, it would never reap the benefits of this optimization.

Though, if you are simply stuck on switch statements you can take the following Java code:

int x;
//read in a value in some form or fashion and assign it to x
switch (x) {
case 1: this.someFunction(); break;
case 2: this.someOtherFunction(); break;
}

To the following Python code (note: the Python code utilizes the power of the "dictionary" data type that is part of the language):

# read in a value in some form or fashion and assign it to x
mySwitch = {
    1: self.someFunction,
    2: self.someOtherFunction
}
callFunct = mySwitch.get(x)
callFunct()

#Neither of these code examples have been compiled or run respectively, just coded off the top of my head ... so if there is a slight syntax error, sorry :)

So... it can "be done", but what advantage does that code have over this Python code?

if x == 1:
    self.someFunction()
elif x == 2:
    self.someOtherFunction()

Well, that honestly depends on who you ask and in what respect you are asking, but for all practical purposes there really isn't an advantage or disadvantage either way; it's mainly a stylistic preference.
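One wrinkle worth mentioning: a switch statement has a default case, and the dictionary version can mimic it through the second argument of dict.get. A minimal standalone sketch (the function names here are hypothetical, just for illustration):

```python
def some_function():
    return "one"

def some_other_function():
    return "two"

def default_function():
    return "default"

# The dictionary maps case labels to callables; dict.get's second
# argument plays the role of switch's `default:` branch.
dispatch = {
    1: some_function,
    2: some_other_function,
}

def switch(x):
    # Look up the handler for x, falling back to the default, then call it.
    return dispatch.get(x, default_function)()

print(switch(1))   # one
print(switch(99))  # default
```

Without the default argument, mySwitch.get(x) returns None for an unknown key, and calling None blows up with a TypeError, so the fallback is worth having.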

Why did I write this?
Well, I am bored ... it is spring break, and this was one of the best ways I could think of to kill time and procrastinate instead of actually writing the lexical analyzer and parser for my compiler theory class. And yes, I am writing a compiler in Python ... why? Because I think it will be funny.

/me


Friday, March 09, 2007

The linux merit badge

When I started on linux, it was the black magic of the computing world. Novell hadn't bought SuSE, HP wasn't writing drivers, Dell hadn't honored its existence, and pretty much the only company actually doing anything with it was RedHat. Back in the days when Gnome 1.x and KDE 1.x reigned supreme, Blackdown was the only way to get Java functional without heavy hacking, the 2.6 kernel had just recently gone stable and many users were scared to upgrade, x86_64 wasn't even a publicly released concept on the hardware end of the spectrum, much less in the software world, PowerPC was still going strong over at the Apple camp, and Windows XP was much anticipated by the Microsoft huggers. When you said "I run linux," people automatically assumed you knew what you were talking about, and at the time there was a 95% chance that you really did.

Fast forward to today: Vista is released, Novell owns SuSE, RedHat offers certifications, Canonical rushed in and took over the linux desktop market by throwing millions of USD at their Ubuntu distribution, HP writes native linux drivers for their printers and random peripherals, Dell is now offering linux on PCs, specialty companies are all over the place offering linux-centric services and hardware, and any noob with a computer set to boot from CD-ROM can run and, in most cases, install linux on their computer. Is this a good thing? Well, yes and no.

Yes:
More users means more support, which is a very good thing.

No:
I am annoyed with people who won't read documentation to learn things on their own; this up-and-coming generation of linux users wants to be spoon-fed everything. When I started, I was taught how to use man pages and learned that google was my best friend.

I recently started as a TA for one of my professors, teaching an Abstract Data Types and programming algorithms class in Java at my University, and there are two, only two, linux users in the class of roughly 25. That doesn't bother me so much; linux isn't for everyone and it still has an "under dog" aura about it. But when I started speaking to these linux-using students (who run Ubuntu), I quickly realized they know nothing about linux. They don't realize that Gnome != "a version of linux", they were lost as soon as I opened a terminal window, and they weren't familiar with even the trivial task of checking their screen resolution. I asked a few questions regarding the Operating System and simply got the reply, "I dunno, when I installed it, it just worked." I thought to myself, "That is horrible," but as the day went on I truly thought it through and realized that this is a milestone for desktop linux. Yes, they are under-informed, but ask your grandmother the difference between explorer.exe and iexplore.exe on a Windows machine and you will receive a blank stare. What has happened is that the linux merit badge, as I like to call it, no longer certifies your knowledge of linux but simply your endorsement of the open source movement: being open-minded enough to use something different and see what the "under dog" has to offer. Your level of involvement, and the reputation you earn along the way, will prove your skill level.

Conclusion:
The linux merit badge has lost a little power behind its punch, but for the betterment of the movement as a whole. The GNU/Linux world stands to gain a lot from the fact that your "average Joe" can use it as a desktop system without flaws. Will linux ever take over the desktop market? I don't know, nobody can really know, but I think we are at least stepping in the right direction by strengthening our user base in numbers. As time goes on and curiosity is sparked, I think the next generation of linux users will educate themselves out of desire, not necessity. So basically, the "Yes" outweighs the "No," and I think the "No" will work itself out with time.

/me

Saturday, February 10, 2007

Ubuntu: crash, burn, or burst into flames....

It's been a while since I have posted, and to anyone who reads this, I am sorry. The semester is in full swing and I am taking a compiler theory class that appears to be rather time consuming, but enough about me....

As everyone knows, Ubuntu has teamed up with Linspire as described here and Linspire's CNR will be ported to Ubuntu's upcoming Feisty Fawn release.

What is my take on this?
Well, we can look at it from multiple points of view. At first glance I don't like the idea, basically because of my old debian habits and ideals, but if I take a moment to truly think about what this will do for new linux users and general desktop users who don't want to bother with the command line, or the annoying complexity of synaptic, I can't find it in myself to completely turn my back on the integration. Also, in the event that Ubuntu is given any say-so in the development cycle of CNR, I could see them taking it to a productive place it has never reached. Ubuntu obviously has some brilliant developers on their team, staff and volunteer alike, and I would like to see where they could take the whole CNR idea.

I know, short and sweet ... I had a sour taste in my mouth about it in the beginning, but after some mental exercise on the topic I think it will turn out to be a positive addition to the Ubuntu world, so there isn't a whole lot more to say.

/me

Wednesday, January 10, 2007

Fun quote

Windows XP - The 64-bit wannabe with a 32-bit graphics interface for 16-bit extensions to a 8-bit patch on a 4-bit operating system designed to run on a 2-bit processor by a company that can't stand 1-bit of competition

Sunday, December 31, 2006

Interesting link...

I doubt anyone reads my blog, but just in case you do. This link is too cool to miss.

http://www.100mb.nl/

Sunday, November 19, 2006

C Follies:

So here I set the stage for what can only be called a "Geek Moment." I find myself sitting at work like a good student employee, but like every other Sys Tech on the planet with nothing to repair, I look to the Internet for entertainment, and there I find the almighty instant messenger. On this interesting communication tool, I converse with a very good friend of mine who will just be referred to as Derr in order to preserve some level of confidentiality. Derr and I, both computer science majors but at different Universities, often find ourselves talking about computer-related topics, and on this day we spoke about the C programming language and its follies. The main issue with C's follies is that they are not so much a problem with the programming language; it's that the average programmer lacks an understanding of what is actually happening behind the scenes and has been spoiled by languages like Java that will raise a compiler error when you try to do something that could later be viewed as "stupid" by the system.

Allow me to begin with a simple code segment:

#include <stdio.h>
#include <stdbool.h>

int main(int argc, char** argv){
    printf("%d\n%d\n", 100 == true, 1 == true);
    int x = 100;
    if (x)
        printf("true\n");
    else
        printf("false\n");
    return 0;
}

Here the standard bool header has been included, so the macros for true and false can be used. As C defines it, false = 0 and true = anything not 0, and thus the statement "100==true" should evaluate to true (or, as the system defaults, a 1), as should the statement "1==true". Then we go into the if statement, where if(x) is translated as "if x is true, then do the body," which should print out the string value "true". Or so most would think from their first look at the code, but in fact that is not how C evaluates this expression.

So what happens when this code is compiled and ran? (compiled with "gcc testbool.c -o testbool")

max@iPseudogen:~/cTheory$ ./testbool
0
1
true

Why does this happen? Because <stdbool.h> defines the macro true as the integer constant 1, "100==true" is really the comparison 100 == 1, which is false. The "anything non-zero is true" rule only applies where C tests a value for truth, as in an if condition; the == operator simply compares the two integer values. This could make for some interesting program runtime characteristics if one were to use a bool-returning function that can return arbitrary non-zero values and then compare its result against true.

Next, something slightly more complicated, which brings up the point that a programmer who does not understand the inner workings of the system can fall into error through bad programming technique:

#include <stdio.h>

int *x;

void f(){
    int y = 1;
    x = &y;
}

void g(){
    int y = 400;
}

int main(){
    f();
    printf("%d \n", *x);
    g();
    printf("%d \n", *x);
    return 0;
}

What happens when this is compiled and run? ("gcc stackframe.c -o stackframe")

max@iPseudogen:~/cTheory$ ./stackframe
1
400

Now, why would this give different output if the two printf statements are identical?
When the function f() is called, the system pushes its function call onto the "call stack," and since the local int variable y is created within the function, its memory allocation is also performed in that stack space. Then x, an int pointer, is assigned the memory location of the local variable y; the programmer has now saved the address of a position in the stack space. Why is this bad? Well, it is demonstrated in this example. When f() finishes, it is popped off the stack and its memory on the call stack is deallocated, to be reused later by another function. The system does just that: the next function call, g(), uses that space, and (lo and behold) it defines a local variable as well and places it in the memory location where the previous locally defined variable was held. Thus, we have demonstrated the ability to alter what a global pointer points to simply by defining a local variable in another function and assigning it a value (strictly speaking, dereferencing x after f() returns is undefined behavior, so you aren't even guaranteed this output). Why is this bad? Well, go write some large-scale project and have this be one of the errors in it; debugging would be fun, don't you think?

I must make a disclaimer that Derr was the one who came up with the code examples and all credit must be given to him, so uhmm... yeah, that was the disclaimer.

Are there more little interesting things like this? Of course; I'm sure there are C follies that Derr and I have never heard of, but these were the ones discussed that day, and we found them interesting enough to blog about while I sit here on my couch with my Xubuntu-powered iBook G4, bored out of my mind. Now, are there tools out there to catch things like this in C code? Yeah, probably ... I just thought this was good food for thought.

-Adam

Tuesday, August 29, 2006

Top down approach to programming

Here we sit in a world of object-oriented everything, where procedural programmers are considered elitists or engineers, and here is my opinion on what is needed to make an efficient programmer today:

Beginnings:
Everyone needs to start somewhere, but where? Well, this is, has been, and always will be a matter of opinion, but from what I have read and experienced through interactions with a lot of other programmers, I believe Java is the appropriate place to begin. Why? That's a good question, and I believe it is because the basic syntax for assigning values to variables of primitive or "built in" data types translates to other languages with very little effort, along with the basic operations: if statements, switch statements, for loops, while loops, etc. Also, if the language is taught properly, the essence and power of object orientation will be intact for the pupil to go on to learning polymorphism, inheritance, design patterns, and/or good programming techniques.

Higher Education:
The second language I think someone should learn is C++, because we are able to handle actual pointers and deeper concepts of reference vs. value passing, true system RAM can be manipulated while still being able to create objects, basic I/O remains simple, and the translation from imported libraries to included headers is not difficult when moving from Java.
Then on to the mother of languages, known by elitists as the godsend: C. I think C is essential for programmers to learn in order to understand what is actually happening behind a method call in Java, like the popular whatever.toString(), along with many other things, such as how important and efficient pre-processor declarations can be.
Finally, assembly language. Yes, it makes you want to run and hide; it might even make you cry, but it is required in my world. If you aren't at least taught the theory of gate logic and don't have a firm understanding of how much work is actually done on the stack with registers and memory addresses, then you fail at life. Without an appreciation for what the compiler does for you, so that your programming language of choice is not so intolerable to code in that you rip your hair out more often than you already do, your path in the computer science world has been led astray and, once again, you fail at life.

Real World Applications:
Now that the data structures have been taught, pointers are understood, type casting is second nature, references and values are no longer something that has to be thought about because you can differentiate the two as though it were child's play, recursion makes more sense than the English language, and you can tell me how your processor actually represents floating point numbers.... You are now ready to practice what I like to call "real world programming languages" like Python, Perl, PHP, Ruby, C#, VB.net, etc. Yes, we have graduated from the "Top Down Approach" and can now appreciate all the hard work our predecessors put into these robust languages. Why? Because you understand what is actually happening when you execute the Python statement myString = "hello" and can further appreciate the power in doing so. Now, just because I have deemed these "real world" doesn't mean that the others aren't used in the real world; it just means I don't believe they should be used as teaching tools, but rather as industry-level languages that should only be used where applicable.

Is it perfect?
No. Nothing is perfect, and this is simply a blog and thus an opinion: my opinion on how I think computer science, with respect to programming languages, should be taught.


... that's today's two cents, take them for what they are worth.

Thursday, July 27, 2006

Commercial Linux Software .... What it needs to succeed

I've been reading a lot of technical articles at popular tech sites about software vendors writing a Linux version of their software. Great!!!

This means a few things:
1) Linux is being recognized by the industry as an entity that must be acknowledged because we are not going anywhere.
2) The suits are becoming aware of Linux in general and feel that there is a potentially profitable market within the Linux world, which is something I think will contribute in a positive way. (Novell has proven that fact)
3) Users who were once upset because their favorite/most used application existed only on other platforms will find it is either on its way to the Linux world or already available, and this will drive the user base in a positive fashion.

What needs to be done in my opinion:
I think companies looking to create and distribute Linux software need at least 5 developers on hand who run only Linux, each running a different distro. Each of these developers needs to be using one of the most commonly used distros in order to distribute packages that will play nice with the systems they target.

You need a developer running one of each: RHEL/CentOS/Fedora (I don't care which), SuSE, Debian, Ubuntu, and Slackware.

Why?
Because these operating systems, through child/derived distros, together account for approximately 90% of the Linux world. Each of these developers should be working together to write or port the software and then package it for the distro they are working on. That way, when I go out and purchase this incredible piece of software for my home machine that runs Xubuntu, or for my server that runs debian, I am also able to run into work to my SuSE workstation and have the convenience of the same software. Also, I am able to spread the word and nobody is left out (because even Gentoo users can install a tar.gz if they must).

Is it foolproof?
Probably not, but I think that most of the community (if not all) who are willing to pay for quality software will agree that this is a wonderful idea and would make the acceptance of commercial software into the Linux world a much smoother process.

... That's my piece, late.

Monday, July 24, 2006

Why I don't like MySpace....

Here is the problem with MySpace: They are getting too popular on the web.

What happens to things that get too popular on the web and aren't stable enough to back it up?
They get bought out by Microsoft.

What will happen if MySpace gets bought out by Microsoft?
Microsoft will have an extremely popular wide open community that will become their favorite e-Advertising base and there isn't anything anyone could do about it.

Then What?
Well, Microsoft would start to integrate MSN and MySpace services: your .NET password would work across all services seamlessly, now including MySpace, and thus all users of MySpace would be required to have an MSN account and vice-versa. There would no longer be a web mail login; it would simply be a control panel to access, modify, and update all of your current .NET-enabled services. All of this plays into their monopolistic nature, once again regaining social web dominance in favor of the evil blue empire. Then, before you know it, there is an applet reserving real estate on the up-and-coming Vista applet bar, and Microsoft is tracking every aspect of your web life, sparking messages from the applet bar like: "I noticed you haven't blogged on Friendster in a couple days, nor have you added any photos to your MySpace, and your MSN mailbox is filling up at an alarming rate. Wait, your coffee cup is still half full and you have only drunk two cups this morning; shall I call a doctor via VoIP, or should I just have Starbucks deliver another Cappuccino?"

I am done writing about this and if you don't get the point yet, you never will....

..... Long live Tux.

Thursday, July 20, 2006

The state of the GNU/Linux desktop, *buntu on the right track

Over the past year I have watched Ubuntu and its partner iterations spring from "that thing Canonical is doing" to the most widely used distro on the planet.

Do I mind?
Yes and No, No and Yes. At first I was greatly upset at how much credit and praise Ubuntu was getting for all of the debian community's hard work (mainly because my heart belongs to debian), but now we are in a time of playing nice, with each organization progressing together to create a development process that will benefit both projects as equally as possible. (Sources on this found here and here) Now that everything has been sorted out, I am one happy camper and consider myself a "debuntu" user: my server still thrives on debian's incredible stability and security, while my desktop reaps the ease-of-use benefits of the *buntu world. I say *buntu because I am an advocate of all Ubuntu flavors; each one offers the same great features as the last, along with the desktop environment that fits its target user best.

What do I run?
Xubuntu. It gives me everything I ever wanted out of a desktop computer for personal, school, and work purposes, and it does it all faster. The first time I mentioned to a friend that I ran Xfce on my new machine, his reaction was a tad in the "shocked" state because he was under the impression it was a desktop environment with a sub-par feature set that only existed in the interest of older hardware. I let him try Xfce for himself, and he quickly realized how far it has come and how fast it is. While I understand that Thunar is still under development and lacks some features the user community would like to see, even in its unfinished state it holds the crown of file managers in my book. So for me, Xubuntu is without doubt "for the win."

Have I tried all flavors of *buntu?
Sure have. Do I like them all? Of course; each one brings to the world the power of debian with the ideals of what the Ubuntu community sees as the needs of the desktop, along with special configurations for a different desktop environment.

Is Xubuntu(Xfce) for everyone?
No, of course not. There are multiple choices because everyone has a different idea of how they want their desktop to interact with them; I just like things simple and fast. Once it is all said and done, it's all about personal preference.

Conclusion:
For personal computing, I think *buntu is where the future lies. Yet I like to consider myself a realist, and I must say that I believe Novell/SuSE is where corporate Linux is headed.