I doubt anyone reads my blog, but just in case you do. This link is too cool to miss.
http://www.100mb.nl/
Sunday, December 31, 2006
Sunday, November 19, 2006
C Follies:
So here I set the stage for what can only be called a "Geek Moment." I find myself sitting at work like a good student employee, but like every other Sys Tech on the planet with nothing to repair, I look to the Internet for entertainment and find the almighty instant messenger. On this interesting communication tool, I converse with a very good friend of mine who will be referred to simply as Derr, to preserve some level of confidentiality. Derr and I, both computer science majors at different universities, often find ourselves talking about computer-related topics, and on this day we spoke about the C programming language and its follies. The main issue with C's follies is that they are not so much a problem with the language itself; it's that the average programmer lacks an understanding of what is actually happening behind the scenes, having been spoiled by languages like Java that raise a compiler error when you try to do something the system would later view as "stupid."
Allow me to begin with a simple code segment:
#include <stdio.h>
#include <stdbool.h>

int main(int argc, char** argv){
    printf("%d\n%d\n", 100 == true, 1 == true);
    int x = 100;
    if(x)
        printf("true\n");
    else
        printf("false\n");
    return 0;
}
Here the standard bool header has been included, so the macros true and false can be used. Traditionally, C treats false as 0 and true as anything nonzero. Thus the statement "100==true" should evaluate to true (or, as the system defaults, a 1), as should the statement "1==true". Then we go into the if statement, where if(x) translates to "if x is true, then do," which should print the string "true". Or so most would think at first look at the code, but in fact that is not how C evaluates the first expression.
So what happens when this code is compiled and ran? (compiled with "gcc testbool.c -o testbool")
max@iPseudogen:~/cTheory$ ./testbool
0
1
true
Why does this happen? Because in C99's stdbool.h, true is simply a macro for the integer constant 1. So "100==true" is really "100==1", which is false (0), while "1==true" is "1==1", which is true (1). The if(x) test, on the other hand, asks only whether x is nonzero. The folly is assuming "== true" means "is truthy": a function that returns some arbitrary nonzero value for success will fail an "== true" comparison unless that value happens to be exactly 1, which could make for some interesting program runtime characteristics if one expected C to follow the traditional "anything nonzero is true" rule everywhere.
Next, something slightly more complicated that brings up how a programmer who does not understand the inner workings of the system can fall into error through bad programming technique:
#include <stdio.h>

int *x;

void f(){
    int y = 1;
    x = &y;        /* saves the address of a stack local: y dies when f returns */
}

void g(){
    int y = 400;   /* lands in the reused stack slot (compile with -O0 to see it) */
    (void)y;
}

int main(){
    f();
    printf("%d \n", *x);   /* dereferencing a dangling pointer: undefined behavior */
    g();
    printf("%d \n", *x);
    return 0;
}
What happens when this is compiled and run? (compiled with "gcc stackframe.c -o stackframe")
max@iPseudogen:~/cTheory$ ./stackframe
1
400
Now, why would this give different output if the two printf statements are identical?
When f() is called, the system pushes a stack frame for it onto the "call stack," and since the local int y is declared inside the function, its storage lives in that frame. Then x, a global int pointer, is assigned the address of the local y, so the programmer has now saved the address of a slot in stack space. Why is this bad? This example demonstrates it. Once f() returns, its frame is popped and that stack memory is considered free for reuse, and the system does just that: the next call, g(), gets the same region for its frame, defines a local variable of its own, and (lo and behold) places it at the very address the previous local occupied. Dereferencing x at that point is undefined behavior; on this compiler, at this optimization level, it happens to read back whatever g() left there. So we have demonstrated the ability to change what a dangling pointer reads simply by defining a local variable in a later call and assigning it a value. Why is this bad? Go write some large-scale project with this as one of its bugs; debugging would be fun, don't you think?
I must make a disclaimer: Derr came up with the code examples, and all credit goes to him. So, uhmm... yeah, that was the disclaimer.
Are there more little interesting things like this? Of course; I'm sure there are C follies that Derr and I have never heard of, but these were the ones discussed that day, and I found them interesting enough to blog about while I sit here on my couch with my Xubuntu-powered iBook G4, bored out of my mind. Now, are there tools out there to catch things like this in C code? Yeah, probably; static analyzers in the lint family are built to flag exactly this kind of thing ... just thought this was good food for thought.
-Adam
Tuesday, August 29, 2006
Top down approach to programming
Here we sit in a world of object-oriented everything, where procedural programmers are considered either elitists or engineers. Here is my opinion on what is needed to make an efficient programmer today:
Beginnings:
Everyone needs to start somewhere, but where? Well, this is, has been, and always will be a matter of opinion, but from what I have read and experienced through interactions with a lot of other programmers, I believe Java is the appropriate place to begin. Why? That's a good question. I believe it is because the basic syntax for assigning values to variables of primitive or "built-in" data types translates to other languages with very little effort, along with the basic control structures: if statements, switch statements, for loops, while loops, etc. Also, if the language is taught properly, the essence and power of object orientation will remain intact for the pupil, who can go on to learn polymorphism, inheritance, design patterns, and good programming technique.
Higher Education:
The second language I think someone should learn is C++, because there we can handle actual pointers and the deeper concepts of pass-by-reference vs. pass-by-value, real system RAM can be manipulated while we are still able to create objects, basic I/O stays simple, and the translation from Java's imported libraries to included headers is not difficult.
Then on to the mother of languages, known by elitists as the godsend: C. I think C is essential for programmers to learn in order to understand what is actually happening behind a method call in Java, like the ever-popular whatever.toString(), along with many other things, such as how important and efficient preprocessor declarations can be.
Finally, assembly language. Yes, it makes you want to run and hide; it might even make you cry, but it is required in my world. If you aren't at least taught the theory of gate logic and don't have a firm understanding of how much work is actually done on a stack with registers and memory addresses, then you fail at life. Without an appreciation for what the compiler does for you, so that your programming language of choice is not so intolerable to code in that you rip your hair out more often than you already do, your path in the computer science world has been led astray and, once again, you fail at life.
Real World Applications:
Now that the data structures have been taught, pointers are understood, type casting is second nature, reference and value no longer require thought because you can differentiate the two as though it were child's play, recursion makes more sense than the English language, and you can tell me how your processor actually represents floating-point numbers... you are now ready to practice what I like to call "real world programming languages" like Python, Perl, PHP, Ruby, C#, VB.NET, etc. Yes, we have graduated from the "Top Down Approach" and can now appreciate all the hard work our predecessors put into these robust languages. Why? Because you understand what is actually happening when you hand Python the statement myString = "hello" and can further appreciate the power in doing so. Now, just because I have deemed these "real world" doesn't mean the others aren't used in the real world; it means I don't believe they should be used as teaching tools, but rather as industry-level languages to be used where applicable.
Is it perfect?
No. Nothing is perfect, and this is simply a blog and thus an opinion: my opinion on how computer science, with respect to programming languages, should be taught.
... that's today's two cents, take them for what they are worth.
Thursday, July 27, 2006
Commercial Linux Software .... What it needs to succeed
I've been reading a lot of technical articles at popular tech sites about software vendors writing a Linux version of their software. Great!!!
This means a few things:
1) Linux is being recognized by the industry as an entity that must be acknowledged because we are not going anywhere.
2) The suits are becoming aware of Linux in general and feel there is a potentially profitable market within the Linux world, which I think will contribute in a positive way. (Novell has proven that fact.)
3) Users who were once upset that their favorite/most-used applications from other platforms weren't available will find them either on their way to the Linux world or already here, and this will drive the user base in a positive fashion.
What needs to be done in my opinion:
I think companies looking to create and distribute Linux software need at least five developers on hand who run only Linux, each on a different distro. Each of these developers needs to be using one of the most commonly used distros in order to distribute packages that will play nicely with the systems they target.
You need a developer running one of each: RHEL/CentOS/Fedora (I don't care which), SuSE, Debian, Ubuntu, and Slackware.
Why?
Because these operating systems, through their child/derived distros, account for approximately 90% of the Linux world. Each of these developers should work together to write or port the software and then package it for the distro they are running. That way, when I go out and purchase this incredible piece of software for my home machine running Xubuntu or my server running Debian, I can also run into work to my SuSE workstation and have the convenience of the same software. And I am able to spread the word and nobody is left out (because even Gentoo users can install a tar.gz if they must).
Is it foolproof?
Probably not, but I think most of the community (if not all) that is willing to pay for quality software will agree this is a wonderful idea, and it would make the acceptance of commercial software into the Linux world a much smoother process.
... That's my piece, late.
Monday, July 24, 2006
Why I don't like MySpace....
Here is the problem with MySpace: They are getting too popular on the web.
What happens to things that get too popular on the web and aren't stable enough to back it up?
They get bought out by Microsoft.
What will happen if MySpace gets bought out by Microsoft?
Microsoft will have an extremely popular wide open community that will become their favorite e-Advertising base and there isn't anything anyone could do about it.
Then What?
Well, Microsoft would start to integrate MSN and MySpace services. Your .NET password would work seamlessly across all services, now including MySpace, and thus all users of MySpace would be required to have an MSN account and vice-versa. There would no longer be a web mail login; it would simply be a control panel to access, modify, and update all of your current .NET-enabled services. All of this plays into the monopolistic nature of once again regaining social web dominance in favor of the evil blue empire. Then, before you know it, there is an applet reserving real estate on the up-and-coming Vista applet bar, and Microsoft is tracking every aspect of your web life, sparking messages from the applet bar like: "I noticed you haven't blogged on Friendster in a couple of days, nor have you added any photos to your MySpace, and your MSN mailbox is filling up at an alarming rate. Wait, your coffee cup is still half full and you have only drunk two cups this morning; shall I call a doctor via VoIP, or should I just have Starbucks deliver another Cappuccino?"
I am done writing about this and if you don't get the point yet, you never will....
..... Long live Tux.
Thursday, July 20, 2006
The state of the GNU/Linux desktop, *buntu on the right track
Over the past year I have watched Ubuntu and its sibling flavors spring from "that thing Canonical is doing" to one of the most widely used distros on the planet.
Do I mind?
Yes and no, no and yes. At first I was greatly upset at how much credit and praise Ubuntu was getting for all of the Debian community's hard work (mainly because my heart belongs to Debian), but now we are in a time of playing nice, with each organization progressing together to create a development process that will benefit both projects as equally as possible. (Sources on this found here and here.) Now that everything has been sorted out, I am one happy camper and consider myself a "debuntu" user: my server still thrives on Debian's incredible stability and security, while my desktop reaps the ease-of-use benefits of the *buntu world. I say *buntu because I am an advocate of all Ubuntu flavors; each one offers the same great features as the others, along with the desktop environment that best fits its target user.
What do I run?
Xubuntu. It gives me everything I ever wanted out of a desktop computer for personal, school, and work purposes, and it does it all faster. The first time I mentioned to a friend that I ran Xfce on my new machine, his reaction was a tad on the "shocked" side because he was under the impression it was a sub-par desktop environment, feature-wise, that existed only in the interest of older hardware. I let him try Xfce for himself, and he quickly realized how far it has come and how fast it is. While I understand that Thunar is still under development and missing some features the user community would like to see, even in its unfinished state it holds the crown of file managers in my book. So for me, Xubuntu is without doubt "for the win."
Have I tried all flavors of *buntu?
Sure have. Do I like them all? Of course; each one brings to the world the power of Debian with the Ubuntu community's ideals of what the desktop needs, along with special configurations for its particular desktop environment.
Is Xubuntu(Xfce) for everyone?
No, of course not. There are multiple choices because everyone has a different idea of how they want their desktop to interact with them; I just like things simple and fast. Once it is all said and done, it's all about personal preference.
Conclusion:
For personal computing, I think *buntu is where the future lies. Yet I like to consider myself a realist, and I must say I believe Novell/SuSE is where corporate Linux is headed.
Friday, June 09, 2006
pseudoCube64 lives ...
Well, my new box is up and running the official debian-amd64 sid release, and I must say it is blisteringly fast. The port is also coming along in stride toward the testing branch and then on to stable; sid is, to my knowledge, currently the only working official branch of Debian on the amd64 architecture, and it runs Xfce 4 like a dream.
specs:
-Athlon64 3200+, Venice core
-1GB low-latency RAM (I forget the brand and model)
-180GB SATA HDD
-nVidia GeForce 6200
... I know, nothing stellar, but it is one hell of a jump over the Pentium III box formerly known as pseudogen, which unfortunately had a power supply failure recently.
/me
Wednesday, May 31, 2006
my iBuntu-G4
Well, I am currently posting from my CS470 class on my iBook G4 running Ubuntu Dapper Drake (RC), which is supposed to go stable in the next few days, and I must say that this does nothing short of kick ass. Everything functioned flawlessly (minus the AirPort Extreme, but that was expected, and the beta driver works rather well), and I couldn't be happier with it. ... I dunno, just posting to post....
/me
Monday, March 27, 2006
PDP-11 ... dead, but never forgotten.
The PDP-11 was a family of 16-bit minicomputers from DEC whose processor architecture at one time dominated large-scale machines and their dumb terminals, at a blistering 4MHz or 8MHz. Now that's no news; nobody cares about old processors that can't keep up with cell phone processors these days. But one of my computer science professors lived in the binary code of this chip and decided to bring it up in my Processor Architectures 2 class, going over the rather unique fact that its opcode instructions were written in octal. That seems alien at first, but since the 16-bit instruction words pack their register and addressing-mode fields in 3-bit groups, octal actually makes sense. And since my prof LIVES for this thing, it has been brought to my attention that our next assignment will be to code an emulator for it, which I am strangely excited about doing. Once that is complete, I will post a link where you can download my emulator, if you have any reason to want it.
... also, here is a link I googled very quickly but haven't read over, though it looks like an interesting resource on the architecture: PDP-11.org
/me
Saturday, February 18, 2006
fscking nUbs
It kills me how many people talk about how their store-bought machines, or even custom machines, dominate so much because they have a 3.8GHz processor, multiple gigs of RAM, or some massive HDD, when about 85-95% of these people have no clue what they are talking about. It isn't rocket science to put together a computer these days, but how many of these "computer geeks" around here who claim to know what they are doing have any idea about their chipset, their north/south bridge, or their memory bandwidth and its latency?
....OK, that's my rant ... I'm done.
Tuesday, October 04, 2005
Open Source Community and GNU/Linux essay ... 0wn3d
“Liberation from the past. Freedom from outdated ideas.” The known world is run by digital networks of inter-operating machines, constantly and consistently transmitting data to keep our hectic lives in functioning order. One company, Penguin Computing, has taken this truth to heart and appealed to the logical, ethical, and emotional aspects of the customer's decision-making process by offering leading-edge GNU/Linux solutions.
In our world, computers run our lives, and within this world there are big players on the corporate side: Microsoft, IBM, HP, Dell, Apple, Intel, AMD, and many others. Yet in rebuttal to the large corporate network, an Open Source community formed: computer scientists, software engineers, and hackers forged together to bring the leading edge in information technology to the world with an open mind and open standards. The logical attraction to this concept is simply that when a large community of abstract, brilliant thinkers comes together, they produce results. Because there is no corporation pressuring deadlines, the results are "done right," with stability, security, and performance in mind. Within this association the GNU/Linux operating system spawned and has, in the past ten years, proven its worth in enterprise information technology solutions; it is now making its strategic move to the home desktop. Simply put, Linux is on the rise to greatness, and Penguin Computing is not only helping move it along but preparing for what tomorrow's innovations shall bring. The performance and stability offered by the Linux kernel are simply unparalleled, making it the perfect base for such an advanced technological movement, one that will carry us into the next phase of this hectic life and the phases to come. The advertisement for moving to the technological world that is GNU/Linux, more commonly known as just "Linux," appeals to the logical aspect of the human mind in the form of a decree: one will no longer be trapped by the old corporate-owned operating systems, but will be freed and liberated by the Open Source community to have choice and to experience real-world computing in an all-new manner.
During the course of each individual's life, they develop their own set of ethics. In the life of an IT professional, that set includes a conviction about the freedom of one's mind: the true spirit of a hacker. This ethical trademark very quickly paves the path to a Linux solution, and, as stated on the page found in the magazine and on their website, Penguin Computing is here to cater to those needing guidance everywhere from the corporate server and data center side all the way to the home desktop. This open-minded outlook brings with it an ethical commitment to the community that produces such technologies. The advertisement draws attention to the unfortunate fact that monopolistic companies like Microsoft sell lesser server systems as multi-thousand-dollar “top of the line” machines, while those informed and educated in the field are drawn not only to the concept but to the actuality of what Open Source offers, and to the far superior manner in which it is offered. Not to mention the cost efficiency: Open Source software is freely distributable and redistributable by law under the GNU GPL (General Public License), and all that requires money is support and administration; something already required to run even those corporate-licensed and corporate-developed operating systems.
Every person not only has the mental capacity to render decisions from logical and ethical standpoints; many decisions are also influenced by one's emotions about a topic. In the world of Open Source, many of its users, supporters, and developers are passionate about being true to the ideals that Open Source portrays within its community. Penguin Computing grasps these concepts and advertises in favor of these feelings of loyalty by recasting the technological movement in a political light. Showing the penguin, the GNU/Linux mascot, standing alongside the founders of the single most powerful nation in the world builds an emotional bridge between the foundation of this country, with the loyalty that goes along with it, and the next generation of computing founded on Linux, with the loyalty that accompanies it. The concept is put forth in a logical sense to appeal to the emotional aspect of decision making, and the possibility of a computer purchase is created by the knowledge that emotion is generally one of the strongest tools for swaying a choice in the human thought process.
We are now liberated from the past, and have achieved freedom from outdated ideas, with the help of Penguin Computing, the Open Source community, and the GNU/Linux operating system appealing to the most influential elements of the persuasive method as declared by Aristotle: logic, ethics, and emotion.
Saturday, July 16, 2005
The Evil Blue Empires ....
Note to the world: Best Buy not only sucks, it sucks even worse to work for ....
P.S. - Microsoft still sucks just a little bit harder than Best Buy.
/me
Saturday, July 02, 2005
Ubuntu Linux dominates the PC world....
I find myself killing time at Barnes and Noble every now and then. On this last trip to the cafe portion of the store, I sat down with a caffeine fix and a few magazines: LinuxWorld Mag., Linux Mag., and PC World. PC World has a list of 2005's top 100 products, and guess where Ubuntu Linux scores? #19 .... a Linux distro has invaded the Windows world and is kicking ass while taking names. Needless to say, I am very amused and satisfied by this for many reasons, one of the main ones being the fact that Ubuntu is a Debian derivative.
So I flip open LinuxWorld Mag. next, and it is kind enough to have a column giving a rather large amount of props to Debian for being the most stable, most un-talked-about badass distro around. It points out that Debian's innovation is widely used, though most people don't realize that when they use Ubuntu, MEPIS, Knoppix, Damn Small, etc., they are actually using a Debian system customized for whatever each distro aims at. Next, the article explains that the apt system, which has been around in Debian for many years, is just recently being ported to distros like SuSE, Fedora Core, Mandrake, and other rpm-based distros via the new apt4rpm system that is taking the Linux world by storm ... so what is the moral of this blog? The simple fact that Debian dominates all, always has and always will ... which is why I have their trademark swirl tattooed on my left ass cheek and running on my server ... that's right ... got a problem with it? Kiss my tattoo .... and my server :)
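For anyone who hasn't touched it, the apt workflow that column is raving about looks roughly like this. This is just a sketch of the classic apt-get front end on a Debian-based box (the package name here is only an example, and these commands need root and a network connection); the whole point of apt4rpm is wrapping this same front end around rpm repositories.

```shell
# Refresh the local package index from the repositories
# listed in /etc/apt/sources.list
apt-get update

# Search the index for a package by keyword
apt-cache search irssi

# Install a package, pulling in its dependencies automatically
apt-get install irssi

# Upgrade every installed package to the newest available version
apt-get upgrade
```

Dependency resolution is the killer feature: apt figures out and fetches everything a package needs in one shot, which is exactly what the rpm world was missing before ports like apt4rpm showed up.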
/me
Tuesday, April 05, 2005
My English Class....
I currently sit in front of a dumb terminal (and I use the term loosely) in English 165, taught by Rita Raju, the stupid bitch who has yet to merit a degree, but for some reason they are allowing her to teach us.
First, I want to go off on a tangent about the damned terminal I sit in front of. Not only does this piece of shit run WinXP, but it is also a Dell. I don't have a big beef with Dell; it just annoys those of us who got excited when they claimed to offer Linux as an alternate OS on their hardware and then reneged on the offer. Then we get to the machine itself: they put far too much money into CPU speed and not enough into RAM. The damned thing runs somewhere around a 2.8GHz P4 and has only 256MB of RAM. My workstation in my dorm has the same amount of RAM, but I am also only running a 1GHz PIII, and well ... Linux manages memory much better than this interesting chunk of code. Then there is the interesting fact that the network (WinStall) packages are managed by students who aren't entirely fluent in the workings of a network or its protocols, so it's no surprise the network is sluggish at best.
Now, back to the stupid bitch standing in front of me. I have an issue with a human being who claims to have taught English for 15 years and yet still can't speak it worth a shit; I seriously want to throw something at her in the hope of correcting her speech upon impact. Her grammar is annoying, and she enjoys telling me that my essays are plagiarized because I am too lazy to make a works cited page. FUCK HER. She is the most contradictory human being on the planet, and there isn't a single student who respects her, enjoys her class, or can take her seriously. If she were not in control of my grades, I would, without a doubt, 0wn the shit out of her.
/me
/me
Tuesday, March 08, 2005
Testing... 1... 2...
Ok, and thus my first rant. I attempted to sit down a few days ago to write something intelligent about the fact that my University's definition of Computer Science for non-majors is rather pathetic, and lo and behold, my newfound "google-zon" service felt the need to deny me the ability to update my blog. Not to mention the second review I plan to do in the (hopefully) not so near future about the new Google file system, which in my belief is the first step toward the Google-Grid and the fall of modern news, fact, and civilization. But no worries; I hope that by then we will all be intelligent enough for our opinions to be worthwhile for spectators. Oh, and I will also be publishing my friend's rant about our campus food; I will make note that it is his writing, but I can't not put it out on the web for the world to see. Ok, well, until next time.... $exit
/me
Wednesday, February 23, 2005
A new beginning
On this day we embrace not only the start of a new blog, but the beginning of my own personal rebirth into the world in which I find myself. I named my journal after my computer. Cheesy, you say? Then you obviously don't know me that well. Anyways, I will do many random things with this blog, everything from ranting about life to writing reviews of technologies from cell phones to supercomputers; it will prove to be quite the compilation of randomness.
/me