We have decided to make each room a folder. However, instead of nesting folders within folders for our room layout, we will keep all of the room folders in a single subdirectory. We have also decided to use multiple files in each folder for the different actions. We have split the project up amongst ourselves: two people will be coding in Python and two people will write the room descriptions, action files, etc. We already have a basic implementation of the project; all that remains is to add some actions, the room descriptions, and the action files.
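As a rough sketch (the folder and file names here are just placeholders, not our final layout), discovering a room's available actions could be as simple as listing the files in its folder:

import os

ROOMS_DIR = 'rooms'                            # the single subdirectory holding every room folder
room = os.path.join(ROOMS_DIR, 'prison_cell')  # hypothetical room name

# treat every file other than the description as an action file
actions = [f for f in os.listdir(room) if f != 'description.txt']
print actions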
Final Project and Group Meetings
In the past two weeks, I have met with my final project group twice. We have decided to create a text-based adventure game for The Walking Dead. The structure of the game will be based on the Linux filesystem. Each folder will be its own room, and the directory structure will define the directions you can move: for instance, the parent folder takes you back, while child folders take you left or right. Items and other actions will be additional files in the folders. Instead of having the entire game text live in the main Python program, we have decided to break it up. We have come up with two ways to do this.
- The room description and items have their own text files that the Python script reads from. In this case, the program would only need to know its folder location and any items picked up. This would make it easy to add a save-game feature.
- Every folder has its own Python script. In this case, the program would only know the folder location, call that folder's Python script, and wait. The folder's script would contain the room description as well as all actions and items. This would make saving the game more difficult and would add complexity in carrying the inventory from room to room.
Both cases have different complexities; however, I personally think that case 1 would be the more elegant way to implement the idea.
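To make case 1 concrete, here is a minimal sketch of what the main program might look like. The rooms/ directory and the file names description.txt and items.txt are placeholders for illustration, not our final design.

import os

def load_room(room_dir):
    # Each room folder holds a plain text description and an optional item list.
    with open(os.path.join(room_dir, 'description.txt')) as f:
        description = f.read()
    items = []
    items_path = os.path.join(room_dir, 'items.txt')
    if os.path.exists(items_path):
        with open(items_path) as f:
            items = [line.strip() for line in f if line.strip()]
    return description, items

# The entire game state is just the current folder plus the inventory,
# which is what makes a save-game feature easy in this design.
current_room = 'rooms/start'
inventory = []
description, items = load_room(current_room)
print description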
If anyone has another idea on implementation, feel free to comment.
But What Does It All Mean?
We are at that point in the semester where we are all feeling a bit overwhelmed. We’re past the introductory material, we’ve covered a lot of new concepts and ideas, there’s the threat of a midterm on the horizon, and to make matters worse the same thing is happening in every other class as well. It is not surprising that this is also the part of the semester in which I get questions like “why are we learning this?”, “how do these assignments teach me about Unix?”, “we’ve covered a lot of commands, a lot of new ideas, and done several assignments, so what should I focus on?”, or “how does it all fit together?”

These are good questions to ask, and a good exercise to answer, both for me, so that I can better shape the course to meet its goals, and for the participants in the course, to solidify the connections between different material and concepts. I will attempt to start that process by providing my own thoughts on these questions: learning Unix is as much (if not more) about learning a different way of thinking and problem solving with a computer as it is about learning how to use the terminal and acclimating yourself to the different GUI environments. And while we’re on the topic of user interfaces, there are several.
Rule of Diversity: Distrust all claims for “one true way”
Unlike Windows or OS X, which provide only a single graphical environment to their users and a single graphical toolkit to their developers, Unix systems have multiple options for both environments (e.g. Ubuntu’s Unity, GNOME Shell, KDE) and toolkits (GTK and Qt are the most common). This confusing jumble isn’t there just to make things needlessly annoying for users; in fact, it is the result of one of Unix’s core philosophies: the user has a better idea of what he or she wants to do than the programmer writing the application. As a result, many decisions are pushed closer to the user level. This is sometimes listed as a downside to Unix/Linux, since it increases the perceived complexity of the system, but luckily many distributions select a sensible default for people who would rather jump right into using their computer than configure it. Ubuntu, with its Unity interface, is a good example of this. But it’s empowering to know that you aren’t stuck with Unity: you can install and use any of the other graphical environments as well, such as GNOME, KDE, LXDE, and more.

Moving on, but keeping the Rule of Diversity in mind, let’s revisit the examples we looked at in class. The first was related to the homework in which we were asked to read in lines in a white-space delimited record format containing fields for “First Name”, “Last Name”, “Amount Owed”, “City”, and “Phone Number”. A short example of such a file is:
Bill Bradley 25.20 Blacksburg 951-1001
Charles Cassidy 14.52 Radford 261-1002
David Dell 35.00 Blacksburg 231-1003
We were then asked to write a program that would print out the information for people in ‘Blacksburg’ in the order “Phone Number”, “Last Name”, “First Name”, “Amount Owed”. A straightforward way to solve this in Python is with the following code snippet:
for line in f:
    fields = line.split()
    if fields[3] == 'Blacksburg':
        record = [fields[4], fields[1], fields[0], fields[2]]
        print ', '.join(record)
In class we looked at an alternative solution using list comprehension:
for fields in (r for r in imap(str.split, f) if r[3] == 'Blacksburg'):
    print ', '.join(fields[i] for i in [4,1,0,2])
Both of these examples can be found on github. They both do the same thing; the first takes 5 lines, the second 2. I made use of a few convenience features to make this happen. The first is the imap function, the iterator version of the map function. The map function is common in many functional programming languages and implements the common task of applying a function (in this case str.split) to every element of a list (in this case f, the file object). This is an extremely common task in programming, but there is no analog in C; luckily, the STL algorithms library gives us std::transform for C++, though the syntax isn’t nearly as clean as Python’s.

So the big question is “If I’ve been implementing this idiom all along, without ‘map’, why change now?” The answer is that implementing it without map is guaranteed to use more lines of code, and we know that more lines of code means a statistically higher chance of making a mistake. In addition, the implementation would look a lot like any of the other loops you have written in the same program, and you will find yourself pausing at it to ask “What am I trying to do here?” Once you learn the concept of ‘map’, using it is much more concise: looking at the call to “map” you know exactly what is going on without having to mentally process a “for” or “while” loop.

This idea is generalized by the concept of list comprehension, which is what we’re doing with the rest of that line. Working with a list of things is really common in programming, and one of the common things we do with lists is to generate new lists that are some filtered and transformed version of the original. List comprehension provides a cleaner syntax for transforming lists (similar to the set-builder notation you may be familiar with from mathematics) than the traditional “for” or “while” loop would yield. More importantly, once you get familiar with the syntax, it lets you more quickly recognize what is going on. For example, let’s look at two ways of computing a list of the Pythagorean triples for values 1 through 10:
triples1 = []
for x in xrange(1,11):
    for y in xrange(1,11):
        for z in xrange(1,11):
            if x**2 + y**2 == z**2:
                triples1.append((x,y,z))
print triples1
and now, using list comprehension:
triples2 = [ (x,y,z)
             for x in xrange(1,11)
             for y in xrange(1,11)
             for z in xrange(1,11)
             if x**2 + y**2 == z**2 ]
print triples2
I’ve broken the second example across several lines so that it will all fit on the screen, but it could be left on a single line (see the full, working example) and still be just as readable. Right off the bat we can look at the second version and tell that `triples2` will be a list of tuples containing three values (x,y,z). We had to work our way down to five levels of nested blocks to figure that out in the first example. And while you may not realize it because you’re so used to doing it, our brains have a much harder time following what is going on in a nested loop; it implies a specific hierarchy that is misleading for this problem.

Let’s shift gears just a bit and look at some of the commands I ran at the end of class. First I wanted to count all the lines of code in all of the *.py files in my current directory:
cat *.py | wc -l
Then I wanted to revise that and filter out any blank lines:
cat *.py | sed '/^$/d' | wc -l
And let’s also filter out any lines that contain only a comment:
cat *.py | sed '/^$/d' | sed '/^#.*$/d' | wc -l
(Note: we could have combined the two `sed` commands into one, e.g. something like `sed '/^$/d; /^#.*$/d'`; I separated them to emphasize the idea of using a pipeline to filter data.) Next I wanted to know which modules I was importing:
cat *.py | grep '^import'
Say I wanted to isolate just the module names; I could use the `cut` command:
cat *.py | grep '^import' | cut -d ' ' -f 2
If you didn’t know about the `cut` command, you could use sed’s `s` command to do a substitution using regular expressions. I will leave the implementation of this as an exercise for the reader.
We notice that there are a few duplicates, so let’s only print out the unique names:
cat *.py | grep '^import' | cut -d ' ' -f 2 | sort | uniq
Exercise for the reader: why is the `sort` necessary?
And finally, let’s count the number of unique modules I’m using:
cat *.py | grep '^import' | cut -d ' ' -f 2 | sort | uniq | wc -l
I could have just shown you the final command and said “this prints the number of modules I’m using”, but I wanted to demonstrate the thought process used to get there. We started with just a two-command pipeline, and then built up the command one piece at a time. This is a great example of another core Unix philosophy: write simple programs that do one thing and do it well, and give them a consistent interface so that they can easily be used together.

Now I admit, counting the number of modules this way required us to start up 6 processes. Luckily, process creation on Unix systems is relatively cheap by design. This had the intended consequence of creating an operating environment in which it made sense to build up complex commands from simpler ones, which in turn encouraged the design of simple programs that do one thing well. We could write a much more efficient program to do this task in C or another compiled language, but the point is, we didn’t have to. As you get more familiar with the simple commands, you’ll find that there are many tasks like this that occur too infrequently to justify writing a dedicated program, but that can be pieced together quickly with a pipeline.
So what the heck do these two different topics, list comprehension and command pipelines, have in common? And why are we using Python at all? Well, Unix’s strength is that it provides a huge wealth of excellent tools and supports a large number of programming languages. It does everything an operating system can do to allow you, the developer, to pick the best tool for the job. As we mentioned before, when we’re developing a program the “best tool” usually means the one that will allow us to solve the problem in the fewest lines possible. Python’s syntax is much cleaner than that of C or C++, and its support for convenience features like list comprehension allows us to implement, in one easy-to-understand line, algorithms that might take several loops in a less expressive language.
This has been a rather long post; I hope you’re still with me. To summarize: don’t worry too much about memorizing every single command right away, that will come naturally as you use them more often (and a refresher is always just a quick call to `man` away). Instead, shift your thinking to a higher level of abstraction and always ask yourself “what tools do I have available to solve this problem?”, then try to pick the “best” one, whatever “best” means in the context you are in. Unix/Linux puts you, the user, and you, the developer, in the driver’s seat; it provides you with a wealth of knobs and buttons to press, but does little to tell you which ones it thinks you *should* press. This can be intimidating, especially coming from a Windows or OS X environment, which tends to make most of those choices for the user. That’s ok, and to be expected. With practice, you will learn to appreciate your newly discovered flexibility and will start having fun!
I want to know what you think! Share your thoughts on what we’ve gone over in class, the assignments we’ve done, and the reading we’ve discussed. How do you see it all fitting together?
PXE Boot Adventures
Setting up the server was not all that difficult, but there were a few setbacks. I started with a 64-bit Debian VM on my laptop with atftp-server and followed a basic tutorial I found online. Setup for this was fairly straightforward for the more common distributions with text installers: you only needed to mount the netinstall ISO, copy the files to the TFTP root (/var/lib/tftp in my case), and make a bootloader entry pointing to the kernel for each OS.
Unfortunately, this did not always work. Distros like Ubuntu with fancy graphical installers could not be served entirely over TFTP and required an NFS share for the rest of the content on the ISO. This was fairly easy to set up by appending an "nfsroot=192.168.0.154:/srv/nfs" string to the kernel line in the syslinux configuration. Eventually, I was able to get most common distributions (Ubuntu, CentOS, Fedora, Debian, etc.) up and running on my VM, but Arch would not boot.
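For reference, a PXELINUX menu entry for this kind of setup might look roughly like the sketch below. The kernel and initrd paths and the casper-specific options are placeholders based on how Ubuntu's live installer is typically netbooted; the nfsroot string is the one mentioned above.

LABEL ubuntu
    MENU LABEL Ubuntu (graphical installer over NFS)
    KERNEL ubuntu/casper/vmlinuz
    APPEND initrd=ubuntu/casper/initrd.lz boot=casper netboot=nfs nfsroot=192.168.0.154:/srv/nfs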
Since I needed a PXE server for another organization, I had everything up and running a few weeks in advance, and just rsynced all of my data to another Debian x64 server on campus. I figured I'd just open the TFTP port for the duration of Installfest so everyone could boot remotely. The night before Installfest, I found out that TFTP booting would not work at all over NAT, because the UDP ports used are chosen randomly and therefore can't be forwarded in advance.
With less than 12 hours remaining, our options were limited to:
- Setting up an iPXE server and handing out USB drives
- Making another PXE boot server
- Hauling a server across campus and up several flights of stairs
- Learning IPsec and setting up a point-to-point VPN
- Using the original (albeit outdated) VM on my laptop
Installfest was a much larger success than in years past, probably due to our promotion at Gobblerfest and spamming of all the listservs. We ended up with 27 successful installations in a few hours, mainly composed of Fedora, Ubuntu, and Arch Linux. Less popular distros included Sabayon, OpenBSD, FreeBSD, DragonflyBSD, and Rebecca Black Linux (yes, that's a thing).
Linux! It’s fun!
Except when it isn’t. And even then it’s fun!
I got a replacement CPU, motherboard, and RAM (also an SSD!) for my 4-year-old desktop. Still no news of my Thinkpad’s ongoing repairs, but I’m pretty sure having a desktop will suit my needs for now.
I went ahead and set up a dual-boot environment with Windows 7 and Arch Linux. There was a time when this would have been easy for me, but I’m a little out of touch with computer-building, and the Arch setup process has gotten slightly more complex than it once was. So I initially forgot to set the RAM timing properly, which caused a bunch of problems with Windows.
Once I’d spent a day or so figuring that out, I decided to set up the Linux side. I had used the SSD for my base Windows install, and it’s crazy-fast. So I plugged in the Arch install disk, which used to come with the Arch Installation Framework (basically an install wizard). They’ve recently removed it from the install media, so I was a little confused. I didn’t want to screw anything up, and I didn’t have a ton of time to spend learning to manually partition my hard drives using cfdisk or parted, so I burned a GParted disk image and did it in a nice, safe GUI. Oh, except that one of my older hard drives started screeching at me like a tiny metal banshee. I suppose it’s what I get for using several-year-old hardware.
So I install Arch and GRUB2, and finally everything is going swimmingly. I’ve set up internet printing using CUPS for the first time, so no more messing with my printer’s USB cable when I want to print from a laptop. I’ve also set up my desktop as a dynamic DNS client using freedns.afraid.org, so I can hopefully remote-access it even when CNS changes my IP (which they’ve started doing in the past couple of years. Gross).
Oh, another thing: somewhere along the line, I tried installing Ubuntu via Wubi and was weirded out by the output of `df -lh`. What exactly is a loop-mounted partition? This requires further study.
First impressions of Ubuntu
So far, Ubuntu seems to be pretty much a mix between Mac OS and Windows. There is a program dock on the left side, and familiar-looking windows open when I click on its icons. A big difference I have noticed so far is that the default directory is home instead of the desktop.
In class we had a group exercise of trying to teach each other about 35 basic command line commands. I thought the exercise was a bit of a hassle, considering some people didn’t really do any work beforehand. Also, 20 minutes was not really enough time to learn all 35 commands well.
Ubuntu: First Impressions
First, I would like to state that Ubuntu is not the first Linux operating system (OS) I have used. I do have some experience with different versions of BackTrack (BT). Before installing Ubuntu 12.04 LTS, I considered learning BT more in depth for a class I am taking at Virginia Tech (Intro to Unix for Engineers). However, after installing Ubuntu, I decided to stick with it. This was mainly because Ubuntu, for me, is a better-designed, more well-rounded OS. BT is geared more towards the computer security workforce, whereas Ubuntu is more of an Average Joe’s OS.
After booting into Ubuntu, you come to a beautiful login screen. After typing in my password and pressing Enter, my desktop seamlessly pops right up. No wait text with a spinning circle next to it, no black transition screen, and no lag when starting up a program. The first thing I notice is that all the essential programs are installed by default: programs similar to Microsoft Word, Excel, PowerPoint, etc. are already installed and on your dock.
Some things worth mentioning:
- Even if you don’t like using the command line, Ubuntu is still an excellent Windows / OS X alternative.
- Any program you could possibly want is probably located in the Package Manager (similar to an app store, but for programs and more), which means less time searching for programs and one place to go to update all your software.
- Most viruses are designed to attack Windows machines. Avoid viruses by switching to Linux.
Blogging for UNIX!
I need a blog to replace my Posterous (given its seemingly imminent demise), and in particular I’d like to write about Unix for a class I’m taking, in which blogging is an optional assignment. All posts on this new blog with the tag ‘ece2524’ will pertain to my experience with this class. The laptop I’ll be using is currently on its way to IBM for repairs, so in this post I’ll just outline my experience with Unix-like operating systems.
I first installed Linux in my freshman year of high school (2006) at the urging of Clark Gaylord, father of my fellow Boy Scout and friend Carter, when I took over the troop’s newly created webmastership. I used SUSE 9, because he happened to have a book and live DVD to lend me. None of the laptop’s networking hardware, USB mass storage, Windows partitions, or optical drives were recognized. Needless to say, I gave up pretty quickly, and Clark showed me how to use PuTTY and WinSCP to connect to the server instead.
During this time I was an avid reader of the site LifeHacker, which was enamored of the up-and-coming Ubuntu Linux distribution. This distribution’s goal was to make desktop Linux as easy to use as other operating systems. On booting the live CD, everything seemed to work! I was soon spending most of my time playing in the Linux side of my dual-boot setup.
By the summer I was very comfortable using Linux as a desktop operating system, and I got a job as a help desk technician and IT drudge at the Virginia Tech Transportation Institute, where Clark was the CTO. This job involved a lot of waiting for progress bars—time that I used (encouraged, of course, by my boss’s boss) to learn about Linux culture and programming, and to gain comfort with the command line. By the next summer I knew enough to be hired back at VTTI, this time as a junior system administrator.
Since those two summers I’ve used Unixen for nearly everything. I’ve used Ubuntu, CentOS, Fedora, Debian, and Arch Linuces; Free, Open, and NetBSDs; and, of course, OS X. I’m no longer comfortable on a system without a bash or zsh prompt.
In summary: Linux has been a major part of my adolescence, and will probably continue to be a major part of my adulthood.