While working on the “assignment” assignment, I realized that creating assignments for a course is an involved process. The focus of an assignment, I think, should be to expose students to fundamental mechanisms for solving problems, so it is better to avoid assignments with multiple problems that all share the same problem-solving mechanism. Further, when it comes to this class, I found that the biggest obstacle to developing more interest was my unfamiliarity with the Linux operating system environment. So my focus in completing the assignment was to come up with something that allows maximum freedom in how it is accomplished, while also being a useful enough tool to encourage students to use it for their own needs, thereby creating one more personal incentive to stay in the Linux environment. While I have learned a whole lot about the Linux environment already, I anticipate I will learn a lot more if I use it for all my computing needs.
I came across an interesting rule by Rob Pike while reading the first chapter of The Art of Unix Programming: “Rule 5. Data dominates. If you’ve chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.” While there can be numerous ways to solve a given problem, I have always felt that solutions built on data structures well fitted to the problem are simpler and easier to extend with new functionality. It felt good to have something I thought was intuitive affirmed, but the text went even further in the Rule of Representation, stating “where you see a choice between complexity in data structures and complexity in code, choose the former.” I had no intuitive affirmation or rejection of that statement.
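As a toy illustration of Pike's rule (my own sketch in Python, not an example from the book): counting word frequencies with parallel lists forces the algorithm to do all the bookkeeping, while choosing a mapping up front makes the algorithm nearly self-evident.

```python
from collections import defaultdict

# Awkward data structure: parallel lists push complexity into the code.
def count_words_lists(words):
    seen, counts = [], []
    for w in words:
        if w in seen:
            counts[seen.index(w)] += 1  # linear search on every repeat
        else:
            seen.append(w)
            counts.append(1)
    return dict(zip(seen, counts))

# Right data structure: with a mapping, the algorithm almost disappears.
def count_words_dict(words):
    counts = defaultdict(int)
    for w in words:
        counts[w] += 1
    return dict(counts)

print(count_words_dict(["unix", "linux", "unix"]))  # {'unix': 2, 'linux': 1}
```

Both functions compute the same result, but the second is shorter, faster, and easier to extend, which is exactly the trade the Rule of Representation argues for.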
This post is much overdue.
As I worked on the original Arduino-Based Pasteurization Scanner (I will now refer to it as the 1.0), I began to realize that many of the components I was adding could possibly be integrated onto a single PCB. When I first began the 1.0, I really had no knowledge of PCB design, so I sort of pushed the thought off to the side for a good year or so. It just so happened that some of my father’s friends in the cider-brewing community got word of the work I had done, and were possibly interested in having one of their own.
For those of you with less technical knowledge of electrical design, PCB stands for Printed Circuit Board: the “chips” that are inside pretty much every sort of electronic you use daily. They generally contain only the minimum amount of hardware necessary in order to cut down on cost. And since a PCB can be designed once and printed as many times as the designer wants, it is a good way to mass-produce an electronic design.
I soon discovered that PCB design is sort of like carpentry, where the mantra of “measure twice, cut once” rules over everything. All the parts that you intend to use get laid out in software, but the designer has to double-check that each part in the design matches the physical dimensions in the software, or else the parts won’t fit and a new board will need to be fabricated.
At first, I wanted my PCB to be exactly the same size as the screen I was using, so that the screw holes on the LCD would match the screw holes on my PCB. Then the PCB could just rest directly behind the screen (and when looking straight on, you wouldn’t even know there was anything there). The pins on the right side of the screenshot above match up to the LCD screen inputs, so a vertical header could be used to directly plug in the screen to the PCB.
As I went on in my design, I tried to minimize the number of traces (the electrical connections between parts on the board) and vias (points where a trace switches from the top side of the PCB to the bottom so that two different traces don’t intersect). I also added something extra: a WiFi module that prints out data when the user connects to it through a web browser (that’s the big rectangular thing in the upper left of the screenshot below). I also added a 4-axis joystick button, another thermometer port, and a speaker.
Eventually, I scrapped the idea of making the PCB have the same size footprint as the screen and made it even smaller, since I was being charged by the square inch of my design. I also tried to optimize the pins I was using on the microcontroller so that the traces needed could be simpler and require fewer vias.
Once the PCB came in, all I needed to do was solder all the parts on and hope they all fit!
They didn’t. In fact, there were a couple of errors in my design, which you can’t see since I fixed them on the backside. Also, my barrel-plug footprint did not match the parts I ordered, so I had to break it out with some wires.
I plan to have a complete final design done by the time I graduate.
I have officially moved my blog away from blogs.lt.vt.edu to my own website: hazyblue.me, which eventually will host not only my blog, but also my teaching philosophy, CV, and other professionally related tidbits. I hope that everyone will follow me over to the dark side as I continue to write about education, technology, and talking cows. I would especially like to hear feedback on my latest post (inspired by the likes of Janet Murray and Alfred Whitehead), but since I haven’t set up a commenting system yet, please respond via your own blog, a tweet, or an email directly to me!
There have been a couple of blog posts recently referencing the switch NASA made from Windows to Debian 6, a GNU/Linux distribution, as the OS running on the laptops aboard the International Space Station. It’s worth noting that Linux is no stranger to the ISS, as it has been a part of ground control operations since the beginning.
The reasons for the space-side switch are quoted as:
…we needed an operating system that was stable and reliable — one that would give us in-house control. So if we needed to patch, adjust, or adapt, we could.
This is satisfying to many Open Source/Linux fans in its own right: a collaborative open source project has once again proved itself more stable and reliable for the (relatively) extraordinary conditions of low Earth orbit than a product produced by a major software giant. Plus one for open source collaboration and peer networks!
But there’s another reason to be excited. And it’s a reason that would not necessarily apply (mostly) to, say, Apple fanatics had NASA decided to switch to OS X instead of Debian. That reason has to do with the collaborative nature of the open source movement, codified in many of the open source licenses under which the software is released. Linux and the GNU tools, which together make up a fully functional operating system, are released under the GNU General Public License. Unlike many licenses used for commercial software, the GPL ensures that software licensed under its terms remains free for users to use, modify, and redistribute. There are certainly some strong criticisms and ongoing debate regarding key aspects of the GPL, especially version 3; the point of contention mostly lies in what is popularly called the “viral” effect of the license: that modified and derived works must also be released under the same license. The GPL might not be appropriate for every developer and every project, but it codifies the spirit of open source software in a way that is agreeable to many developers and users.
So what does this all mean in terms of NASA’s move? We already know that they chose GNU/Linux for its reliability and stability over the alternatives, but that doesn’t mean it’s completely bug free or will always work perfectly with every piece of hardware. No OS will be, and that, after all, is another reason for the switch: at least Debian gives NASA the flexibility to make improvements themselves. And therein lies the reason for excitement. While there is no requirement that NASA redistribute their own modified versions of the software, there is no reason to assume they wouldn’t in most cases, and if they do, the modifications will be redistributed under the same license. It’s certainly realistic to expect they will be directing a lot of attention to making the Linux kernel and the GNU tools packaged with Debian even more stable and more reliable, and those improvements will make their way back into the general distributions that we all use. This means better hardware support for all GNU/Linux users in the future!
And of course it works both ways. Any bug fixes you make and redistribute may make their way back to the ISS, transforming humanity’s thirst for exploring “the final frontier” into a truly collaborative and global endeavor.
First off, GTA is one of my favorites, and I think it’s the best of the sandbox games. So this “GTM” project was the very first game I decided to review. Since it’s a text-based game, I didn’t expect to see many text-based images in the gameplay; drawing those images out of symbols would take too long, considering the team also had to come up with and implement a story line. But what impressed me is that they did implement plenty of graphics, and they all look wonderful, while still delivering a great story line with four challenging missions. The source code is also well organized and commented. Everything is named intuitively, so I was able to find and hack the weapon damage in two minutes :P. The game does need some polish on balancing and gameplay experience; however, we are not expecting a group of professional game designers here (and they fixed most of the issues that I posted on GitHub). So from a programming perspective, they have done a wonderful job within a couple of weeks.
First of all, I like this game and I think it is done very well. The code looks intuitive and efficient. Creating separate classes for the ball and the bricks is a good idea, and good object-oriented style. I agree with you guys that using Qt and its QWidget is an easy way to implement it. I’ve used Qt for many projects and believe it’s a great tool for building programs with a simple GUI, or even small games. But I do think Qt is way too large to install for those who don’t use it regularly. The game itself looks good in spite of some trivial glitches, such as the ball being knocked through the wall in some extreme situations. Overall it’s a great game with smooth gameplay, but I do think you guys could work a bit more on the graphics and key-input handling. I realize that the key-press delay is hard to solve, since all key input is handled by the OS, and a held key is delivered as a sequence of repeated key events. However, it is especially uncomfortable in this kind of game, which requires fast reactions from the player.
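One common workaround for the key-repeat delay (a sketch of my own in Python, not the team's actual Qt code) is to stop reacting to the OS's repeated key events entirely: record press and release events in a set of currently held keys, and have the game loop poll that set once per frame.

```python
class KeyState:
    """Track which keys are currently held, ignoring OS auto-repeat."""

    def __init__(self):
        self.held = set()

    def on_press(self, key, is_auto_repeat=False):
        # Auto-repeat fires synthetic press events for a key that is
        # already down; ignoring them ties movement to the frame rate
        # instead of the OS repeat rate.
        if not is_auto_repeat:
            self.held.add(key)

    def on_release(self, key):
        self.held.discard(key)


# Each frame, the game loop polls the set instead of waiting for repeats:
keys = KeyState()
keys.on_press("Left")
keys.on_press("Left", is_auto_repeat=True)  # OS repeat event, ignored
paddle_dx = -1 if "Left" in keys.held else 0
print(paddle_dx)  # -1
```

In Qt specifically, the press/release events carry an auto-repeat flag (QKeyEvent::isAutoRepeat()), so the same state-tracking approach maps onto keyPressEvent and keyReleaseEvent directly.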
Imagine that your multi-core Android phone becomes a real PC. The Ubuntu project is now developing an Ubuntu desktop variant for Android phones. It is not a simplified or portable version of Ubuntu remade for cell phones: Ubuntu for Android will be the real Ubuntu system running on your phone.
If this upcoming feature is finally introduced, your phone will be able to run Android and Ubuntu at the same time. This means no dual-boot or virtual machine is required; the two run simultaneously, since both Android and Ubuntu use the Linux kernel. For more information, check out: http://www.ubuntu.com/phone/ubuntu-for-android
Finally the midterm is over, and I feel I did well on it. This was one of the best exams I’ve ever had: not because the test was not difficult, but because it provided quite a good reflection of the skills we have learned in this class. We had to actually write Python code, Bash commands, and some regular expressions in a set amount of time. Even the test submission was through Git. That’s what I think a programming class should be, but so many of them fail at it: they ask you to write up paragraphs explaining a single concept instead of testing your actual coding ability. I finished this test in an hour and twenty minutes, but I double-checked every problem very carefully, so I think you all should be able to finish the test on time. Anyway, this was a great test, and I hope you all get a good score on it.
Are you tired of typing every single Git command into the terminal? You may want to check out SmartGit on its website: http://www.syntevo.com/smartgithg/index.html
This is a program that allows you to manage your Git repositories in a GUI environment. It supports all platforms (Windows, Mac, and Linux). However, you can only use it for free for 30 days.
Here’s another Git GUI app called git-cola. This one is completely free to use, and I would recommend it if you are looking for a really compact, easy-to-use Git GUI.