nxfury

Musings of a *Nix Nerd

Odds are you've seen this guy around on the internet: Tux the Penguin

This cute little penguin is the official mascot of Linux- one of the leading operating systems in existence, deployed on everything from the majority of the world's servers and smartphones to nearly all of its supercomputers. Due to its stability and compatibility, the enterprise world craves Linux and loves it... So what has always made Linux unique?

About The Open Source Movement

From the 1970s until the mid-to-late 1990s, Open Source Software was termed "Free Software", in the sense that source code or schematics were freely available to the public to examine, modify and use to verify the legitimacy of the products they purchased.

Folks unfamiliar with this might know the similar practice of dedicating research work to the public domain- a practice that continues to this day and paints a clearer picture of the same spirit.

Why Many Geeks Despise Paid Software

To understand why so many Linux and BSD lovers have an avid hatred for Windows and Mac, we shall backtrack to the history of the Apple computer. Waaaaaaaaaay back when dinosaurs roamed the earth- just kidding, cavemen did. Wait, wrong era. Back in the mid-1970s, a man named Steve "Woz" Wozniak invented, designed and prototyped the first Apple computer, which he termed the "Apple":

This computer was way ahead of its time: it was one of the first computers an individual could actually do real work on, as most machines of the era were prohibitively expensive, oversized, or simply not user-friendly. Wozniak gave his first demo of the device at the Homebrew Computer Club, where an excited buzz filled the air as budding hackers got their own schematic sheets to build one themselves and attempt to run software on it. Bill Gates also crossed paths with this crowd- by many accounts he appeared curious, but later on down the road he would send this alarming letter:

Bill Gates Open Letter to Hobbyists

At the time, the developers furthering this research made use of Altair BASIC, as it was in vogue- an invention of Bill Gates, sold at astronomically high prices by a young company named Micro-Soft. Naturally, people would copy the tapes and redistribute them. But why would Bill Gates go after people furthering research? To this day, geeks and open source advocates share a dislike for Microsoft simply due to the company's past stance on open source and public-domain research- they simply don't trust Microsoft. Apple isn't much better: Steve Jobs would later go into business with Wozniak and alienate him from his own invention.

UNIX

Fast forward a few years, and AT&T's Bell Labs would invent the UNIX operating system and the C programming language. Because UNIX was written in C rather than assembly (like virtually every other system at the time), it was uniquely capable, and AT&T decided to market it. Adjusted for inflation, a copy of UNIX cost approximately $10,000 USD, which included the source code and the system installation disks. The system's innovations meant it quickly took over in universities and military applications.

One of these universities was the University of California at Berkeley. They took their copy of UNIX and rolled their own tools into it, calling the result BSD. They would go on to redistribute the source code for free, resulting in a lawsuit that AT&T effectively lost- giving the Open Source movement a jumpstart in the early 1990s.

Linux

During this lawsuit, a man named Linus Torvalds released a free kernel to the public, inspired by the UNIX systems he wanted to have at home. He called it Linux; development STILL continues to this day, and the system is highly regarded amongst developers for its stability and compliance with known standards (like POSIX).

Why Does All This Ancient History Matter?

Well... It's not that ancient... Anyway, how is someone supposed to trust companies founded on the principle that dedicating source code to the public domain is a bad thing, and that sharing ideas for research and innovation is inherently harmful?

Also, it's important to note that these companies still attempt to invade users' privacy to sell data to the highest bidder, they still use their customers as "guinea pigs" by rolling out updates to unstable release software, and they still prevent public access to the source code, so no one can even attempt to fix these problems.

What makes users THINK they will change?

Alright, So What Should I Do?

If you are opposed to such unethical practices, then consider leaving Windows and Mac in protest- cease to use their products and avoid giving them money. After all, money speaks louder than words.

Also, if you support what the Open Source community does, consider participating, or release your next personal project under an open source license: Choose An Open Source License

Finally, if you support certain open source projects, definitely consider participating in them to assist their growth. There are so many projects in desperate need of developers, artists, authors and more.

If you enjoyed this content, be sure to subscribe, leave a comment and tell your friends!

Read more...

Due to the sheer enjoyment of writing about enabling support for old Broadcom cards back in the day, it's time to share another horror story- another fiendish tale of what caused me to leave Fedora Linux, never to return. For the Dante aficionados, let's enter the proverbial "9 Circles of Dependency Hell".

But I Just Wanted to Play Quake 3!

Too bad, so sad. I had the official Return to Castle Wolfenstein CD and the .run file to install the game on my Linux system, and wasn't aware of what its dependencies were. For those who are unaware, a dependency is a piece of software that your program needs in order to run. Oblivious to what was needed, I mounted the CD and ran the installer. Little did I know that I would be in for days of work. The game launched, and I was enjoying the WWII prison-breakout glory of Wolfenstein.
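You can see a program's dependencies for yourself with ldd (a quick sketch- /bin/ls here is just a stand-in for any dynamically linked binary, Wolfenstein included):

```shell
# list the shared libraries a binary depends on;
# a missing one would show up as "not found" - exactly
# the state my Wolfenstein install ended up in
ldd /bin/ls
```

On a healthy system every line resolves to a library path; dependency hell begins the day one of them doesn't.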

No Games For You

So, Fedora uses a package manager to aid in installing software updates. The hitch is that it used to (and may still) upgrade everything without ensuring you could revert to the previous state. Unbeknownst to me at the time, my game depended on old versions of certain libraries to work.

Months prior, I had automated the installation of updates to run once a week and forgotten about it. Little did I know that the following Sunday morning, I would not be able to launch Wolfenstein because it was missing critical libraries.

Enter Dependency Hell

After doing some research on the packages I needed to launch Wolfenstein, I wound up downloading .rpm files of the correct older versions- I had known what dependency hell was before this fiasco. But would it strike me? No, I was a sysadmin- I knew my way out of this! Squeezing my stuffed Tux- I mean penguin- I proceeded to install the .rpm files using the rpm -ivh command. Little did I expect what would happen next...

Installing the old packages with rpm removed the newer libraries the update had put in place! Now I couldn't launch my file manager, and VLC, LibreOffice, and GIMP- apps that I used regularly- began crashing.

I ran yum -y update to restore the downgraded software, and then Wolfenstein wouldn't start again. This is why I can't have nice things...

Doing The Unspeakable

I wanted to play Wolfenstein badly at this point, and realized I hadn't tried compiling the old versions from source. This involves taking the source code, performing some voodoo magic on it, and producing a binary. Software installed this way generally isn't tracked by the package manager, so it gets left alone during updates. I thus embarked on a saga of making my Core 2 Duo CPU (at the time) scream bloody murder.

Tell Me More About Compilation!!!

Fine... It's not voodoo magic: GNU Make automates the execution of various compilers in a specific order. A compiler, in a nutshell, is just a program that translates source code into another language- in most cases machine language, producing binary programs. Compilation is CPU-intensive, and just great for warming a home in the winter.....

Highway to (s)Hell

So after taking inventory of the various versions of software I needed to compile, from libSDL to Xlib, I started downloading the specific source versions for each application I needed. 2-3 hours later, I had a folder full of source code. Since I wanted this project over with, I used a while loop in bash to automate extracting all the tarballs into their own separate folders.
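The loop looked something like this (a reconstruction from memory- the tarball name is made up, and it assumes GNU tar for --strip-components):

```shell
# make a demo tarball to stand in for the real source archives
mkdir -p libdemo-1.0 && echo 'int x;' > libdemo-1.0/demo.c
tar czf libdemo-1.0.tar.gz libdemo-1.0 && rm -rf libdemo-1.0

# extract every tarball into its own separate folder
ls *.tar.gz | while read -r tarball; do
    dir="${tarball%.tar.gz}"        # libdemo-1.0.tar.gz -> libdemo-1.0
    mkdir -p "$dir"
    tar xzf "$tarball" -C "$dir" --strip-components=1
done
```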

“You Could Roast a Marshmallow on That Thing!!!”

I had made the willful choice of compiling the software I needed from source, and was going to see the project through to completion. For each package, I would enter its folder, run ./configure to generate a Makefile tailored to my hardware, then run make, followed by make install. Each application took about 30-45 minutes given the speed of the CPU, and the majority of that time was spent waiting on the computer to finish its prescribed suffering- I mean compiling.

By the time I was done compiling, it was Tuesday morning, and my laptop was so hot that I had plugged in an external keyboard and mouse to use it, with the device propped up on two textbooks to maintain airflow for cooling.

Wasted Time

When I was finally done, I grabbed a bag of my favorite chips- jalapeño flavored- and launched Wolfenstein. To a nerd's delight it launched, and I muttered "IT'S ALIIIIIIIIVE!!!" to myself. When I began playing the game, however, my excitement evaporated- the audio was stuttering and the video was horribly choppy. I didn't meet the RAM requirements and would have to wait for an upgrade... :(

The Upgrade

In just a couple of days, the RAM arrived! I tore open the packaging and quickly added it to my laptop. The game finally worked, in all its glory! I was running, shooting 'em up, and defeating Nazi officers in a valiant attempt to save the world. However, the power brick that charged my laptop wasn't powerful enough for the upgraded machine, so I waited on a close friend to snag me a spare charger from his old job- they were closing the office and liquidating old hardware (I would later receive my second laptop from this closure).

Multiplayer Not-So-Awesomeness

As it turned out, multiplayer was widely considered one of the best parts of Return to Castle Wolfenstein. So naturally, I wanted to try it out. However, every time I attempted to join a multiplayer server, I would get errors about "PunkBuster not working". On Linux at the time, there was a lack of documentation on how to resolve this issue. I tried modifying the PunkBuster configuration, to no avail. I even reinstalled Wolfenstein. Still, I couldn't play multiplayer until a patch for the game was released- and so I missed out on the peak of the multiplayer action.

Nowadays the multiplayer servers are down, and the game is widely considered a good old game- one of the best video games ever made. And I missed my shot at enjoying it while it was still fresh because Fedora just wouldn't play nice. As a result, I wound up leaving Fedora Linux, never to return. Back then I left for Ubuntu, but would later migrate to Arch Linux. Little did I know I would leave even that for Gentoo, Slackware, and BSD in pursuit of security... which would later become a passion of mine.

I hope you enjoyed this content! If you do, following my twitter at @nxfury01 or subscribing to my email list at the bottom of this page will notify you every time I release a new post. Thanks for your support!

Read more...

Cryptography- it's always a hot debate topic in computing. Society tries to preserve it and ensure ciphers are extremely hard to crack, to aid in the preservation of privacy (and thus free speech). Governments often oppose strong cryptographic ciphers precisely because of that difficulty, which makes investigations and surveillance of other people harder.

However, there's no denying that such systems seem very arcane and tough to understand, and this series of posts intends to shed some light on how cryptographers implement systems that are extremely hard to crack.

This post series exists to help educate people on the importance of cryptographic research and how it corresponds to your privacy online, and how you can better protect yourself in a high risk environment. I would like to give credit where it's due, as I learned most of this content from “Applied Cryptography” by Bruce Schneier.

Like a Lightswitch: Boolean Logic

So let's quickly cram an intro computer science class into a couple of paragraphs to preface all this... Boolean logic is just a fancy term for the ability to do math with nothing more than true-or-false statements and a few special operations. This is achieved through the binary number system, which behaves much like the decimal system in that it has a "place" for digits of a certain value. However, instead of a 1s, 10s, 100s, etc. place, the binary system has a 1s, 2s, 4s, 8s, etc. place. A binary digit is called a bit, and a number that is 8 binary digits long is called a byte.

Like normal math, we can do addition, subtraction, and so on with binary numbers... But we can do more than that: since binary 0 is "False" and anything other than 0 is "True", we can also use the AND, OR, XOR, NAND, and NOT operations on our numbers. AND, OR and NOT are pretty self-explanatory (AND is true only when both inputs are true; OR is true when at least one is; NOT flips its input). NAND stands for NOT AND: you perform AND, then invert the output. XOR ("exclusive or") outputs true only if exactly one input is true.
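Bash can actually demonstrate all of this itself, since its $(( )) arithmetic understands base-2 constants (2#1010 is binary 1010) and the bitwise operators & (AND), | (OR), ^ (XOR) and ~ (NOT)- a small bash-specific sketch:

```shell
a=$((2#1010))   # binary 1010 = 8+2 = 10
b=$((2#0110))   # binary 0110 = 4+2 = 6

echo $(( a & b ))              # AND  -> binary 0010 = 2
echo $(( a | b ))              # OR   -> binary 1110 = 14
echo $(( a ^ b ))              # XOR  -> binary 1100 = 12
echo $(( ~(a & b) & 2#1111 ))  # NAND, masked to 4 bits -> binary 1101 = 13
```

Every cipher we'll look at is, at bottom, enormous towers of exactly these operations.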

Cryptography Basics

So what is cryptography? Ideally, a cryptographic function is an algorithm that can be reversed by only one method (knowing the secret key), with the original contents impossible to recover any other way. In practice this is often not the case- which is why security experts say nothing is 100% secure: there will always be unknown holes in your cryptographic functions and systems.

When a cryptographic function takes a message and a single "key", and performs a series of Boolean and mathematical operations such that the same key both encrypts and decrypts the message, it is called a symmetric encryption algorithm. Some of the leading symmetric algorithms (in terms of security) are AES-256, ChaCha20 and Salsa20.
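A quick symmetric round trip, sketched with the openssl command-line tool (the filenames and password are made up, and the -pbkdf2 flag assumes OpenSSL 1.1.1 or newer):

```shell
echo 'attack at dawn' > msg.txt

# encrypt and decrypt with the SAME secret - that's what makes it symmetric
openssl enc -aes-256-cbc -pbkdf2 -pass pass:hunter2 -in msg.txt -out msg.enc
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:hunter2 -in msg.enc -out msg.dec

cmp msg.txt msg.dec && echo 'round trip OK'
```

Anyone without the password sees only the ciphertext in msg.enc; anyone with it gets the message back exactly.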

If there are two keys- one for decryption (called the private key) and one for encryption (called the public key)- it is called an asymmetric encryption algorithm. Some of the leading asymmetric algorithms are RSA and Elliptic-Curve Diffie-Hellman (ECDH).

Finally, a hashing algorithm takes a message as input, performs a series of operations on it, and outputs a fixed-length blob of garbled information- but inputting the same message always yields the same output, and the process cannot be reversed. This is why hashing is common for storing passwords and login information. Common hashing algorithms are SHA-256 and SHA-512.
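You can watch that determinism with sha256sum- the same input always produces the same digest, and changing a single character changes everything:

```shell
# same input, same 64-hex-digit digest - every single time
printf 'hello' | sha256sum
# -> 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824

# one character off - a completely different digest
printf 'hellp' | sha256sum
```

That sensitivity (the "avalanche effect") is exactly what makes hashes useful for verifying passwords and downloads.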

Keys require random numbers for their creation, and cryptographic systems often rely on software to generate them. The ongoing problem is that computers are deterministic and incapable of true randomness, so there is ongoing research into Cryptographically Secure Pseudo-Random Number Generators (CSPRNGs). Alternatively, some people opt for Hardware Random Number Generators (HRNGs) to produce their crypto keys.
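The CSPRNG built into openssl is an easy way to see this in practice (on Linux it is ultimately seeded from the kernel's entropy pool):

```shell
# 32 cryptographically secure random bytes, hex-encoded (64 hex chars)
openssl rand -hex 32

# run it again - the output will, with overwhelming probability, differ
openssl rand -hex 32
```

Key material for the systems discussed below would come from a generator like this, never from an ordinary rand()-style function.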

Planning our Cryptosystem

Let's say Bob and Alice want to email each other, but they fear Eve- our eavesdropper- might be listening in. How can we securely share secret cryptographic keys in such a manner that it's impossible for Eve to get them?

Using Multiple Systems

Using some code, it's entirely possible to stitch together multiple algorithms. So we could send messages encrypted under EC-Diffie-Hellman-derived keys, but protect our public and private keys at rest with AES-256 encryption and a personal password. That way it's not practically possible for "Eve" to obtain the encryption and decryption keys without tricking Bob and Alice themselves.

To do this, we need to understand which EC-Diffie-Hellman keys go where. The public key encrypts the message, while the private key decrypts it. So for this to work, Bob would need Alice's public key plus his own private key, both encrypted with AES-256, while Alice would need Bob's public key and her own private key, also encrypted with AES-256.

To simplify this...
1) Bob and Alice generate public and private keypairs.
2) Bob and Alice swap public keys.
3) Bob encrypts Alice's public key and his private key.
4) Alice encrypts Bob's public key and her private key.
5) When they wish to email, they unlock their keys.
6) After unlocking their keys, they encrypt their messages.
7) To decrypt a message, Bob or Alice unlocks their keys.
8) They then use their private key to decrypt the message.
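Here's a rough sketch of the keypair, key-swap, and key-protection steps using openssl's X25519 support (a stand-in for the full EC-Diffie-Hellman scheme; all filenames and the passphrase are made up for illustration). In real ECDH, Bob and Alice each combine their private key with the other's public key and arrive at the same shared secret:

```shell
# 1) Bob and Alice generate public/private keypairs
openssl genpkey -algorithm X25519 -out alice.pem
openssl pkey -in alice.pem -pubout -out alice.pub
openssl genpkey -algorithm X25519 -out bob.pem
openssl pkey -in bob.pem -pubout -out bob.pub

# 2) after swapping public keys, each side derives the shared secret
openssl pkeyutl -derive -inkey alice.pem -peerkey bob.pub -out alice_secret.bin
openssl pkeyutl -derive -inkey bob.pem -peerkey alice.pub -out bob_secret.bin
cmp alice_secret.bin bob_secret.bin && echo 'shared secrets match'

# 3) protect the private key at rest with AES-256 and a personal password
openssl enc -aes-256-cbc -pbkdf2 -pass pass:hunter2 -in alice.pem -out alice.pem.enc
```

Eve can intercept alice.pub and bob.pub all day- without either private key, the shared secret stays out of her reach.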

This seems rather complex, although most of the process is automated and running behind the scenes. Software like this would manifest itself as a “keychain” or “keyring” in major programs.

The Plan

The first step, which will be shared in the next post, will be to implement a CSPRNG and a hashing algorithm so we can generate keys.

The second step will be to implement an EC-Diffie-Hellman cryptographic function, using the hashing algorithm and CSPRNG to aid in the generation of keys.

The third step will be to implement AES-256, which will complete the cryptosystem and allow for encryption of the keys at rest.

The last milestone of this project will be to provide a simple and clean interface so an end-user can encrypt their emails.

References

Schneier, B. (2015). Applied cryptography: Protocols, algorithms, and source code in C. Indianapolis, IN: Wiley.

Read more...

Once upon a gloomy day, an innocent programmer (innocent? yeah, right...) stared at his Linux terminal in dismay, only to find that the WiFi card he'd installed wasn't supported- and he had thrown out the old one. This tale of woe documents my actual misadventures with the Linux kernel, back in the days of kernel version 2.6.

Tell-tale signs

I remember getting my old 2007 Dell XPS right when it came out, from a third-party seller who had swapped the included WiFi card with one that was absolutely horrible for the time. Since a replacement was cheap, I invested in one- a Broadcom card.

After a bit of waiting and checking the mailbox constantly, it arrived; I excitedly popped open the laptop and inserted the card... and threw out the old one. Welcome to hell...

I booted my Ubuntu installation, complete with wobbly windows, fired up bash, and excitedly ran ping google.com- which failed. The happy smile quickly turned into an analytical frown as I wondered why. Running ifconfig didn't list the new wlan card either...

Google-Fu

Since I knew this was a Broadcom WiFi card, I plugged into Ethernet (which thankfully worked) and began a massive googling spree. After a couple hours of searching for "Broadcom WiFi not working Linux", "Linux Broadcom support", and so on, I discovered I needed a package called ndiswrapper, which allows Windows XP wireless drivers to be loaded under Linux through a wrapper.

NDISWrapper Hell

Excited that there was a solution, I downloaded the Windows XP driver and installed the ndiswrapper package. After adding blacklist bcm43xx, blacklist b43, blacklist b43legacy and blacklist ssb to /etc/modprobe.d/blacklist.conf (so the in-kernel drivers wouldn't claim the card), I was ready to install the driver. I installed it in a Windows XP virtual machine and obtained the .INF file that corresponded to the Broadcom card. From there, I believed it to be a simple sudo ndiswrapper -i broadcom.inf to install the driver.
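For reference, the blacklist file ended up looking like this- one directive per line is what modprobe expects:

```
# /etc/modprobe.d/blacklist.conf
# keep the in-kernel Broadcom drivers from claiming the card
blacklist bcm43xx
blacklist b43
blacklist b43legacy
blacklist ssb
```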

But lo and behold, the driver wasn't written for my CPU architecture, and the installation failed every single time! Out of desperation, I even experimented with QEMU to see if I could just emulate the driver, to no avail.

After 3 days of banging my head against a desk, rebooting, restoring from a backup and more, I gave up entirely on NDISWrapper and turned back to Google.

The Discovery

After a couple more hours of Google-Fu, I stumbled upon a discovery: a developer was working on a patchset for the very WiFi card I had, and these changes weren't in the mainline kernel yet! For those who are unaware, patchsets are groups of .patch files that you can apply with the patch command, modifying source code to match what the developer wrote. Excitedly, I downloaded the patchset and a fresh copy of the Linux kernel source code.
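If you've never used patch, here's a tiny self-contained demo of what each .patch file in a patchset does (toy filenames, not the actual Broadcom patches):

```shell
# two versions of a "source file"
printf 'int speed = 10;\n' > driver.c
printf 'int speed = 54;\n' > driver_fixed.c

# a .patch file is just the recorded difference between them
# (diff exits nonzero when the files differ, hence the || true)
diff -u driver.c driver_fixed.c > 1.patch || true

# applying the patch rewrites the old file into the new one
patch driver.c < 1.patch
cat driver.c   # now reads: int speed = 54;
```

A kernel patchset is the same idea scaled up: dozens of these files, applied in order, turning the stock source tree into the developer's version.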

Kernel Games

With everything downloaded, I extracted the Linux kernel source code and cd'ed into it. I then ran patch < 1.patch over and over, changing the patch filename each time, until I had applied the entire set. Next, I executed cp /boot/config-$(uname -r) .config to copy the stock Ubuntu kernel config into the .config file required for compiling a new kernel. After that, it was just a matter of running make menuconfig to customize the kernel and enable the Broadcom driver. After saving and exiting, it was time to compile the kernel.

After running make deb-pkg LOCALVERSION=-broadcom, I sat and waited... for 18 hours. When I woke the next day, compilation had completed. As expected per the Debian documentation, the .deb files were one directory up. After verifying they were there, I installed the kernel via dpkg, and a reboot confirmed the custom kernel was running.

It Works, But...

After installing the custom kernel, WiFi was finally working. However, the speed didn't increase and the performance wasn't as advertised. This led me to believe it was a hardware limitation- and that I had wasted all this time over a stupid WiFi card...

I later just installed an Ethernet wall jack where I kept my laptop, because I wanted the speed at the time. The laptop would go on to last until 2012, when it gave off magical blue smoke.

Moral of the story: DON'T THROW AWAY GOOD HARDWARE BEFORE YOU TEST!

Read more...

I had an encounter with a friend who had an insatiable craving to talk about his newfound love for this cool new thing called Python. When he claimed that it was the "most popular" and "the best", I had a rough time gathering my thoughts to counter this claim. For those interested in this little rant, this is the reasoning I used to help temper my friend's unconditional love for a programming language.

Due to the Python programming language's surge in popularity, people's intense love of the language has also surged. Although this is a blessing for the Open Source community and attracts budding software developers, many avid Python programmers ("Pythonistas") claim that it's the most popular language in existence. On top of this, many won't listen to counterclaims about the use of other languages.

What's really the most used?

To tackle this question, one must first define the term popular. For the sake of discussion it is easiest to say that the most widely deployed programming language should be considered the most popular for the simple reason that the enterprise world, the government and many major software companies are still making use of it.

Given this definition, we can safely say that the C programming language is easily the most popular in this regard: it is deployed in virtually every operating system in existence, and sees use in most (if not all) of the DNS implementations we rely on today, among much else. For those who don't know, DNS is what maps human-readable site names to IP addresses.

Among the most deployed would be JavaScript, as most webpages and many mobile apps utilize it.

According to the definition of popular in this post, this also makes COBOL one of the most popular languages, as approximately 90% of the banking system is reportedly powered by this old language dating back to 1959.

COBOL????

Yes, COBOL. I hate it too, and I would rather remove my ear with a rusty spoon than code in it. Sadly, it's so widely deployed and in such demand in the enterprise space for a simple reason: most of the old Application Programming Interfaces (APIs) for large companies such as banks and car manufacturers were created in the 1980s. Pair that with the fact that it's far more expensive to rewrite these systems than to maintain them, and COBOL has a permanent place as one of the most popular languages for years to come.

Remember that as defined above, popular means “most widely deployed” in terms of this discussion. Python definitely outpaces programming languages like LISP or Pascal in terms of popularity, but doesn't quite make the dent that JavaScript, C or COBOL have.

On top of this, because it is an interpreted language, Python struggles in environments where every bit of speed matters. As a result, it doesn't appear as often as compiled languages in mission-critical environments.

HOWEVER...

If we define popular as the “most well received” programming language, Python easily lays claim to being one of the most convenient languages to code in and being one of the top languages most programmers choose to learn first nowadays.

On top of this, it's important to note that strides are being made in compiling Python code, though it's not quite stable at the moment. Once this gets achieved, it is possible that it could quickly take over the enterprise space.

The Real Question

How do you change someone's mind about their love of a programming language? You don't- but you can help them clarify their logic more so actual discourse may occur.

That's all. Rant over. Be sure to stick around and bookmark this page for new posts! Next time, I'll share some (mis)Adventures in building Linux From Scratch and more technical content.

Read more...
