I doubt anyone reads my blog, but just in case you do, this link is too cool to miss.
http://www.100mb.nl/
Sunday, December 31, 2006
Sunday, November 19, 2006
C Follies:
So here I set the stage for what can only be called a "Geek Moment." I find myself sitting at work like a good student employee, but like every other Sys Tech on the planet with nothing to repair, I look to the Internet for entertainment and there I find the almighty instant messenger. On this interesting communication tool I converse with a very good friend of mine, who will be referred to simply as Derr in order to preserve some level of confidentiality. Derr and I, both computer science majors at different universities, often find ourselves talking about computer-related topics, and on this day we spoke about the C programming language and its follies. The thing about C's follies is that they are not so much a problem with the language itself; it's that the average programmer lacks an understanding of what is actually happening behind the scenes, having been spoiled by languages like Java that raise a compiler error when you try to do something the system would later consider "stupid."
Allow me to begin with a simple code segment:
#include <stdio.h>
#include <stdbool.h>

int main(int argc, char** argv){
    printf("%d\n%d\n", 100 == true, 1 == true);
    int x = 100;
    if (x)
        printf("true\n");
    else
        printf("false\n");
    return 0;
}
Here the standard bool type has been included, so the macros true and false are available. As C is commonly described: false = 0 and true = anything nonzero. By that reasoning, the statement "100==true" should evaluate to true (or, as the system defaults, 1), as should "1==true". Then we reach the if statement, where if(x) reads "if x is true, then do." So the program should print "true", or so most would think from their first look at the code, but in fact that is not how C evaluates these expressions.
So what happens when this code is compiled and run? (compiled with "gcc testbool.c -o testbool")
max@iPseudogen:~/cTheory$ ./testbool
0
1
true
Why does this happen? With stdbool.h included, true is simply the integer 1, so "100==true" compares 100 with 1 and evaluates to 0, while "1==true" compares 1 with 1 and evaluates to 1. An if statement, by contrast, only asks whether its expression is nonzero, so if(x) happily treats 100 as true. In other words, == performs an exact integer comparison and does not abide by the same truthiness rule an if statement does. This could make for some interesting run-time behavior if one compared a bool-returning expression directly against true and expected C to treat every nonzero value as equal to it.
Next, something slightly more complicated, which brings up how a programmer who does not understand the inner workings of the system can fall into error through bad programming technique:
#include <stdio.h>

int *x;

void f(){
    int y = 1;
    x = &y;   /* saving the address of a local: x dangles once f() returns */
}

void g(){
    int y = 400;
    (void)y;  /* unused on purpose; it lands in the recycled stack slot */
}

int main(void){
    f();
    printf("%d \n", *x);
    g();
    printf("%d \n", *x);
    return 0;
}
What happens when this is compiled and run? ("gcc stackframe.c -o stackframe")
max@iPseudogen:~/cTheory$ ./stackframe
1
400
Now, why would two verbatim printf statements give different output?
When the function f() is called, the system pushes a frame for it onto the "call stack," and since the local int variable y is declared within the function, its storage lives in that stack frame. Then x, a global int pointer, is assigned the address of the local variable y; the programmer has now saved the address of a slot in stack space. Why is this bad? This example demonstrates it. When f() returns, its frame is popped off the stack and that memory is freed for reuse by another function, and the system does just that: the next call, g(), gets the same region, and (lo and behold) it defines a local variable as well and places it in the very slot the previous local variable occupied. Dereferencing x at this point is undefined behavior; we are reading through a dangling pointer, and here it happens to show g()'s value. So we have "changed" the value read through a global pointer simply by defining an unrelated local variable and assigning to it. Why is this bad? Go write some large-scale project with this as one of its errors; debugging would be fun, don't you think?
I must make a disclaimer that Derr was the one who came up with the code examples, and all credit must be given to him. So, uhmm... yeah, that was the disclaimer.
Are there more little interesting things like this? Of course. I'm sure there are C follies that Derr and I have never heard of, but these were the ones discussed that day, and I found them interesting enough to blog about while I sit here on my couch with my Xubuntu-powered iBook G4, bored out of my mind. Now, are there tools out there to catch things like this in C code? Yeah, probably... just thought this was good food for thought.
-Adam
Tuesday, August 29, 2006
Top down approach to programming
Here we sit in a world of object-oriented everything, where procedural programmers are considered elitists or engineers. Here is my opinion on what is needed to make an efficient programmer today:
Beginnings:
Everyone needs to start somewhere, but where? This is, has been, and always will be a matter of opinion, but from what I have read and experienced through interactions with a lot of other programmers, I believe Java is the appropriate place to begin. Why? Because the basic syntax for assigning values to variables of primitive or "built-in" data types translates to other languages with very little effort, along with the basic constructs: if statements, switch statements, for loops, while loops, etc. Also, if the language is taught properly, the essence and power of object orientation remain intact for the pupil, who can go on to learn polymorphism, inheritance, design patterns, and good programming techniques.
Higher Education:
The second language I think someone should learn is C++, because it exposes actual pointers and the deeper concepts of passing by reference vs. by value; real system RAM can be manipulated while you are still able to create objects and perform simple basic I/O; and the translation from Java's imported libraries to included headers is not difficult.
Then on to the mother of languages, known by elitists as a godsend: C. I think C is essential for programmers to learn in order to understand what is actually happening behind a method call in Java, like the ever-popular whatever.toString();, along with many other things, such as how important and efficient pre-processor declarations can be.
Finally, assembly language. Yes, it makes you want to run and hide; it might even make you cry, but it is required in my world. If you aren't at least taught the theory of gate logic and don't have a firm understanding of how much work is actually done on the stack with registers and memory addresses, then you fail at life. Without an appreciation for what the compiler does for you, so that your programming language of choice is not so intolerable that you rip your hair out more often than you already do, your path in the computer science world has been led astray and, once again, you fail at life.
Real World Applications:
Now that the data structures have been taught, pointers are understood, type casting is second nature, reference and value are no longer something you have to think about because you can differentiate the two as though it were child's play, recursion makes more sense than the English language, and you can tell me how your processor actually represents floating point numbers... you are now ready to practice what I like to call "real world programming languages" like Python, Perl, PHP, Ruby, C#, VB.NET, etc. Yes, we have graduated from the "Top Down Approach" and can now appreciate all the hard work our predecessors put into these robust languages. Why? Because you understand what is actually happening when you hand Python the statement myString = "hello" and can further appreciate the power in doing so. Now, just because I have deemed these "real world" doesn't mean the others aren't used in the real world; it means I don't believe they should be used as teaching tools, but rather as industry-level languages to be used where applicable.
Is it perfect?
No. Nothing is perfect, and this is simply a blog and thus an opinion: my opinion on how computer science, with respect to programming languages, should be taught.
... that's today's two cents, take them for what they are worth.
Thursday, July 27, 2006
Commercial Linux Software .... What it needs to succeed
I've been reading a lot of technical articles at popular tech sites about software vendors writing a Linux version of their software. Great!!!
This means a few things:
1) Linux is being recognized by the industry as an entity that must be acknowledged because we are not going anywhere.
2) The suits are becoming aware of Linux in general and feel there is a potentially profitable market within the Linux world, which I think will contribute in a positive way. (Novell has proven that fact.)
3) Users who were once upset will find that their favorite or most-used applications from other platforms are either on their way to the Linux world or already available, and this will drive the user base in a positive fashion.
What needs to be done in my opinion:
I think companies looking to create and distribute Linux software need at least five developers on hand who run only Linux, each on a different distro. Each of these developers needs to be using one of the most commonly used distros in order to produce packages that play nice with the systems they target.
You need a developer running one of each: RHEL/CentOS/Fedora (I don't care which), SuSE, Debian, Ubuntu, and Slackware.
Why?
Because these operating systems, together with their child/derived distros, account for approximately 90% of the Linux world. Each of these developers should work together to write or port the software and then package it for the distro they run. That way, when I go out and purchase this incredible piece of software for my home machine running Xubuntu or for my server running Debian, I can also run into work to my SuSE workstation and have the convenience of the same software. Also, I am able to spread the word and nobody is left out (because even Gentoo users can install a tar.gz if they must).
Is it foolproof?
Probably not, but I think most of the community (if not all of it) that is willing to pay for quality software will agree that this is a wonderful idea and would make the acceptance of commercial software into the Linux world a much smoother process.
... That's my piece, late.
Monday, July 24, 2006
Why I don't like MySpace....
Here is the problem with MySpace: They are getting too popular on the web.
What happens to things that get too popular on the web and aren't stable enough to back it up?
They get bought out by Microsoft.
What will happen if MySpace gets bought out by Microsoft?
Microsoft will have an extremely popular wide open community that will become their favorite e-Advertising base and there isn't anything anyone could do about it.
Then What?
Well, Microsoft would start to integrate MSN and MySpace services; your .NET password would work across all services seamlessly, now including MySpace, and thus all users of MySpace would be required to have an MSN account and vice versa. There would no longer be a web mail login; it would simply be a control panel to access, modify, and update all of your .NET-enabled services. All of this plays into the monopolistic nature of the company, once again regaining social web dominance in favor of the evil blue empire. Then, before you know it, there is an applet reserving real estate on the up-and-coming Vista applet bar, and Microsoft is tracking every aspect of your web life, sparking messages from the applet bar like: "I noticed you haven't blogged on Friendster in a couple of days, nor have you added any photos to your MySpace, and your MSN mailbox is filling up at an alarming rate. Wait, your coffee cup is still half full and you have only drunk two cups this morning; shall I call a doctor via VoIP, or should I just have Starbucks deliver another Cappuccino?"
I am done writing about this and if you don't get the point yet, you never will....
..... Long live Tux.
Thursday, July 20, 2006
The state of the GNU/Linux desktop, *buntu on the right track
Over the past year I have watched Ubuntu and its partner iterations spring from "that thing Canonical is doing" to the most widely used distro on the planet.
Do I mind?
Yes and no, no and yes. At first I was greatly upset at how much credit and praise Ubuntu was getting for the Debian community's hard work (mainly because my heart belongs to Debian), but we are now in a time of playing nice, with each organization progressing together to create a development process that benefits both projects as equally as possible. (Sources on this found here and here.) Now that everything has been sorted out, I am one happy camper and consider myself a "debuntu" user: my server still thrives on Debian's incredible stability and security, while my desktop reaps the ease-of-use benefits of the *buntu world. I say *buntu because I am an advocate of all Ubuntu flavors; each one offers the same great features as the others, along with the desktop environment that best fits its target user.
What do I run?
Xubuntu. It gives me everything I ever wanted out of a desktop computer for personal, school, and work purposes, and it does it all faster. The first time I mentioned to a friend that I ran Xfce on my new machine, his reaction was a tad shocked, because he was under the impression it was a sub-par desktop environment feature-wise that existed only for the sake of older hardware. I let him try Xfce for himself, and he quickly realized how far it has come and how fast it is. While I understand that Thunar is still under development and lacks some features the user community would like to see, even in its unfinished state it holds the crown of file managers in my book. So for me, Xubuntu is without doubt "for the win."
Have I tried all flavors of *buntu?
Sure have. Do I like them all? Of course; each one brings to the world the power of Debian along with the Ubuntu community's ideals of what a desktop needs, plus a tailored configuration for its desktop environment.
Is Xubuntu(Xfce) for everyone?
No, of course not. There are multiple choices because everyone has a different idea of how they want their desktop to interact with them; I just like things simple and fast. Once it is all said and done, it's all about personal preference.
Conclusion:
For personal computing, I think *buntu is where the future lies. Yet I like to consider myself a realist, and I must say I believe Novell/SuSE is the direction corporate Linux is headed.
Friday, June 09, 2006
pseudoCube64 lives ...
Well, my new box is up and running debian-amd64 (the official sid release), and I must say it is blisteringly fast. It is also coming along in stride toward the testing branch and then on to stable; sid is currently the only working official branch of Debian on the amd64 architecture, to my knowledge, and it runs xfce4 like a dream.
specs:
-Athlon64 3200+ (Venice core)
-1GB low-latency RAM (I forget the brand and model)
-180GB SATA HDD
-nVidia GeForce 6200
... I know, nothing stellar, but it is one hell of a jump over the Pentium III box formerly known as pseudogen, which unfortunately had a power supply failure recently.
/me
Wednesday, May 31, 2006
my iBuntu-G4
Well, I am currently posting from my CS470 class on my iBook G4 running Ubuntu Dapper Drake (RC), which is supposed to go stable in the next few days, and I must say that it does nothing short of kick ass. Everything functioned flawlessly (minus the AirPort Extreme, but that was expected, and the beta driver works rather well) and I couldn't be happier with it. ... I dunno, just posting to post...
/me
Monday, March 27, 2006
PDP-11 ... dead, but never forgotten.
The PDP-11 is (or was) a processor architecture (to my knowledge, there was only the one line before it went the way of the dinosaurs) that at one time dominated large-scale mainframes and dumb terminals at a blistering 4MHz or 8MHz. Now, that's no news; nobody cares about old processors that can't keep up with cell phone processors these days. But one of my computer science professors lived in the binary code of this chip and decided to bring it up in my Processor Architectures 2 class, going over the rather unique fact that it took all of its opcode instructions in octal. That is a very alien way of doing things, but since it was a 16-bit microprocessor, it actually makes sense. And since my prof LIVES for this thing, it has been brought to my attention that our next assignment will be to code an emulator for it, which I am strangely excited about doing. Once that is complete, I will post a link where you can download my emulator, if you have any reason to want it.
... also, here is a link I google'd very quickly but haven't read over, though it looks interesting with regard to the architecture: PDP-11.org
/me
Saturday, February 18, 2006
fscking nUbs
It kills me how many people talk about how their store-bought machines, or even custom machines, dominate so much because they have a 3.8GHz proc, multiple gigs of RAM, or some massive HDD, when about 85-95% of these people have no clue what they are talking about. It isn't rocket science to put together a computer these days, but how many of these "computer geeks" around here who claim to know what they are doing have any idea about their chipset, their north/south bridge, their memory bandwidth, or its latency?
....ok, that's my rant ... I'm done.