Sunday, December 4, 2011
Well, generally, FTP and HTTP can be quicker and more secure, as they do not rely on crowdsourcing or seeders. If set up correctly, the entire file is located in one place and can be downloaded continuously without risk of interruption.
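Resumability is half the appeal of a single-source download. A sketch of what that looks like in practice; the mirror URL is made up, and the commands are stored as strings rather than run, so nothing is actually fetched:

```shell
# Sketch only: placeholder URL, commands printed rather than executed.
resume_wget="wget -c http://mirror.example.com/distro.iso"    # -c continues a partial file
resume_curl="curl -C - -O ftp://mirror.example.com/distro.iso" # same idea with curl
echo "$resume_wget"
echo "$resume_curl"
```

With a single trusted server, an interrupted transfer just picks up where it left off, instead of depending on which seeders happen to be online.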
2) Which command would you give to perform a complete upgrade?
up2date -u or yum update
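As a sketch, here's one way to pick between the two depending on the system; it only prints the command it would run, so it's safe to paste anywhere (up2date has largely given way to yum on newer Red Hat systems):

```shell
# Print (not run) the full-upgrade command for whichever package
# manager is present on this machine; nothing gets installed.
full_upgrade() {
  if command -v yum >/dev/null 2>&1; then
    echo "yum update"                          # RHEL/Fedora family
  elif command -v apt-get >/dev/null 2>&1; then
    echo "apt-get update && apt-get upgrade"   # Debian/Ubuntu family
  else
    echo "no known package manager found"
  fi
}
full_upgrade
```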
3) Why would you build a package from its source code when a (binary) deb file is available?
The difference between building your own package from the source code and using a deb file is the difference between making pancakes from scratch and using Bisquick; while the pre-built binary file may serve, you have greater options and control when you build the package from 'scratch'. You are able to customize it to work best on your system, fix bugs, and use the most recent version available.
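The 'from scratch' recipe is pretty standard; this sketch stores the classic steps in a string instead of running them (example-1.0 is a made-up package name), and the --prefix and CFLAGS flags are exactly the kind of control a pre-built binary doesn't give you:

```shell
# The canonical source-build recipe, stored as text so it is safe to
# run anywhere; against a real tarball you would run these directly.
steps="tar xzf example-1.0.tar.gz
cd example-1.0
./configure --prefix=/usr/local CFLAGS='-O2'   # choose install dir and compiler flags
make                                           # compile for THIS machine
sudo make install"
echo "$steps"
```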
4) Suggest two advantages that deb files have over source distributions.
Again, deb files are like Bisquick, or boxed cake mix... they allow for quicker installation and deployment, as well as providing automatic dependency resolution between packages and libraries.
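To see what "automatic dependency resolution" buys you, here's a toy sketch of the walk a tool like apt does for you behind the scenes; the package names and dependencies are invented for illustration:

```shell
# Toy dependency resolver: list everything a package needs, deepest
# dependencies first, visiting each package only once.
deps_of() {                      # made-up dependency map
  case "$1" in
    webapp)  echo "libhttp libjson" ;;
    libhttp) echo "libssl" ;;
    *)       echo "" ;;
  esac
}
resolve() {                      # depth-first: children before parent
  for d in $(deps_of "$1"); do resolve "$d"; done
  case " $seen " in
    *" $1 "*) ;;                 # already listed, skip
    *) seen="$seen $1"; echo "$1" ;;
  esac
}
seen=""
resolve webapp                   # prints: libssl libhttp libjson webapp
```

A deb-based installer does this walk across thousands of real packages, then fetches and installs each one in order, which is why `apt-get install` of one package can pull in a dozen libraries you never had to think about.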
5) When you compile a package yourself, rather than from a deb file, which directory hierarchy should you put it in?
Locally compiled software goes under /usr/local (with its own bin, lib, share, and so on), the hierarchy the Filesystem Hierarchy Standard sets aside for the local administrator; /opt is another option for large, self-contained packages. Keeping your own builds there means they won't collide with files the package manager controls.
6) Which steps should you take before performing an upgrade on a mission-critical server?
The best bet would be to clone your server exactly on a second system, then run the upgrade on the clone while observing closely for failures and errors, and for what sort of configuration information needs to be tweaked by hand. That way you can ensure the safety and functionality of the full upgrade before touching the real server. It also helps to take a full backup or snapshot first, so you can roll back if something does go wrong.
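A sketch of what that rehearsal might look like; the hostnames and paths are made up, and the commands are stored as strings rather than executed:

```shell
# Rehearsal plan, printed rather than run (prod/clone are placeholders).
rehearse="rsync -aAX --exclude=/proc --exclude=/sys root@prod:/ root@clone:/  # copy prod to the clone
ssh root@clone 'cp -a /etc /root/etc-before-upgrade'                          # snapshot configs first
ssh root@clone 'yum update'                                                   # upgrade the CLONE only
ssh root@clone 'diff -r /root/etc-before-upgrade /etc'                        # spot configs needing hand-tweaks"
echo "$rehearse"
```

The diff at the end is the payoff: it shows exactly which configuration files the upgrade touched, which is your checklist for the real server.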
Saturday, November 19, 2011
The Open Prosthetics Project says, "Prosthetics shouldn't cost an arm and a leg," and they are putting their money where their motto is. They have changed the approach to meeting the needs of disabled people by turning to open source and reverse engineering for their projects. Reverse engineering has long been a dirty word in the corporate world. As soon as your technology is on the market, somebody is going to take it apart and figure out how to make it themselves, thus becoming competition. Or worse, they may find major flaws that could embarrass the company and cost millions. Companies spend a lot of money on research and development, and they don't like to see all that effort go to waste. But for prostheses, reverse engineering means the difference between vast improvements and reinventing the wheel for every varied situation. Now they can take a design that already works and find ways to fit it to different needs and lifestyles, or improve on general problems.
One such project is the Body-Powered Hook project. As the simplest and most common upper-body prosthesis, it's something we're all familiar with: grasping hooks at the end of an arm cup that the user operates by secondary, unrelated movements of their body. They've been around for decades, and come with a myriad of challenges for users. The OPP came to life when founder Jonathan Kuniholm returned from Iraq without his arm, but with three different prostheses, one being a body-powered hook. Immediately, Jonathan and his partners at Tackle Design saw ways they could make improvements on the design, and thus improve the lives of millions of disabled people. Rather than a grasping hook limited by the strength of the springs or the user, the prototype uses a 'vector prehensor'. It's a pin-and-slot setup reminiscent of the adjustable plier- and vise-type wrenches found in a garage, which relies on the movement of the pin to change the fulcrum of the hinge, thus increasing or decreasing the pressure with which the graspers close.
Another project is called the pediatric trainer. Imagining children with amputations is a sad thing, but imagine being a parent trying to help your child learn to operate their new prosthetic when they're still having trouble with natural motor skills like walking or using their remaining hand! Children and parents must attend physical and occupational therapy sessions with specialized trainers, who help the child learn to use and understand the prosthetic by asking the child to complete a task, such as gripping an item, and providing positive feedback when the child completes the motion required to accomplish it. It is a time-consuming and stressful process for child and parent alike. OPP seeks to create an automated training system that can measure the movements and force of the child's effort and provide appropriate feedback. This way, children will have constant reinforcement of their work, rather than getting precise training only in physical therapy sessions.
By developing projects and opening them up to the public, not only do they allow people to take their ideas and create them in more cost effective ways, but they also increase the number of people looking at a problem, and increase the number of creative solutions. Just like with software, new flaws can be found more quickly, and innovative changes flow more freely.
Sunday, November 13, 2011
Now for the scariest part of all: anyone with a computer can search and view anything, at any time, from the comfort of their home, office, public library, mobile phone, or Internet coffee shop. So your grandma can keep up with her grandchildren, find her favorite recipes, and research brands and products, while your brother can peruse (yes, I mean peruse, as in intently look, not "browse" meaning to skim over) videos of people hit by buses and midget porn while trolling the Sesame Street forums. It also means that if somebody decides to link to a shock site in grandma's kitten video comment section, she and countless other unsuspecting people will know EXACTLY what 2 Girls, 1 Cup is.
But when you really break it down, the Internet is about commerce. Best Buy, Walmart, Amazon, Overstock, Newegg, Barnes and Noble, and homemade wholesaler Etsy are all vying for your retail dollars. If you search for a product using Google, Bing, or Yahoo, you will get results from a litany of sources, large and small, and have the freedom to evaluate the options yourself. You might find that momscameras.com, a small store in New Haven, CT, sells that Canon 50D for MUCH less than Best Buy, and even after shipping it to Nevada, you get a bargain. Or you might find that your textbooks are way cheaper from Amazon than your college bookstore, if you're willing to wait for them. If you want to research a product, you can find reviews from dedicated sites and from angry customers, and then you can figure out for yourself whether that much-hyped holiday must-have is a good purchase.
It's a double-edged sword, but you have to take the good with the bad. Or at least that was the FCC's stance. For years, the Federal Communications Commission enforced standards on web service providers like Comcast, Charter, AT&T, etc. These providers must, no matter the content of the site, provide equal and open access to information. Amazon couldn't pay to be the first/only website to come up when you search for books; the New York Times can't pay to push the New York Post to the back page. The FCC required companies to be neutral and fair, and disallowed them from using their power as access providers to alter or limit that access. They aren't allowed to play gatekeeper. Or they weren't, until an appeals court decision in 2010. Comcast challenged the FCC's power to regulate broadband Internet services, and the court agreed. So in December 2010, the FCC had to agree to a compromise, granting more power over the content of wireless Internet providers to the providers themselves. Service providers and investors hail these changes as positive, but opponents from all over note the potential for abuse of power.
Not only could companies pay to be put ahead of their competition, businesses could pay to eliminate sites they find questionable. Take file-sharing sites and tools like The Pirate Bay or uTorrent. The MPAA and RIAA have been at war for years to stop their content from being traded over the Internet using these sites. Congressional hearings, injunctions, raids, lawsuits against children and grandmothers, and subpoenas have flown left and right. But under the new rules, not only would Sony-MCA be able to pay for prioritization over RCA, but the RIAA could pay Comcast to filter and disable any searches or links to torrent sites. The law bans ISPs from blocking access to 'legal' content, but file sharing of copyrighted material is illegal, so that's OK, right? No, because file-sharing sites and torrents are not only used for illegal file sharing. Many smaller artists and companies make their files and products available for fair use and wide dissemination using torrents. Even 'major' musical acts like Radiohead and Nine Inch Nails have released their albums over the Internet in this manner.
Additionally, rules were added to allow ISPs to create tiered access systems. As with data plans currently, you pay more for more speed and more bandwidth, and a growing number of providers are charging more for how much data you actually use. But plans are now in the works to allow them to charge you more for "premium" web pages. So for $35 a month, you can have access to Walmart.com, government services, and YouTube, but if you want Facebook and the Wall Street Journal, you'll have to purchase the $50 plan. FOX News videos will load at 16 Mbps, but CNN videos will be throttled to 5 Mbps; you'll be waiting longer for content to load from sites that don't pay up. That is, if you can access it at all.
Then the worst possibility of all comes up. Who, if not the FCC, gets to decide what is illegal content? The Internet does not obey political boundaries; there are no state or country borders between sites. But laws do vary, not just from country to country, but from state to state. With power shifted into the hands of the people who stand to benefit MOST from this arrangement, there is little incentive to play fair. Countries like China, Iran, and North Korea have strict bans on content they deem illegal, so their citizens can't access Facebook, Twitter, CNN, or Google. What if ISPs like Comcast, or companies like Microsoft that find themselves frequently in competition with sites like Google, decide those sites shouldn't be available to their customers? Or what if a site is illegal in China, and an ISP is owned by a Chinese company, so it blocks access to that site by its American clients? What if companies pay to 'throttle' access to articles and information that might be negative to their business, or that demonstrate criminal action?
This compromise, and the FCC's loss of power, is no compromise at all. It is a direct effort by powerful companies to limit the flow of information. Where the Internet used to be an equalizing factor, a bit of the American dream where anyone can find or be anything, it is at risk of being turned into a class system. Knowledge is power, and somebody wants to decide who gets to be plugged in.
Sunday, November 6, 2011
Having a background in Linux opens up the world. You can work in ANY sector, public or private. You can work in any industry: aerospace, education, hospitality, politics, sales, transportation, shipping, information technology, research, marketing, engineering, quality assurance, gaming (casinos), software, healthcare... the list is endless because of the prevalence of technology and computers in today's society. Virtually EVERY company has a website and uses computers in its day-to-day operations, and those computers and websites need systems administrators, network engineers, programmers, IT troubleshooters, and designers to build, run, and repair them.
Although many companies still favor background and experience over formal certifications, it is likely that in the future these certifications will be more commonplace. LPIC-1 is an 'entry level' certification. You must pass two exams and are expected to demonstrate an ability to work at a command line, execute specific tasks, run routine maintenance, assist users, and set up and network workstations or individual computers. LPIC-2 steps it up a bit: you must already hold LPIC-1, then pass another two exams demonstrating you are able to perform more detailed maintenance on a larger site, run automation, manage assistants, and administer mixed systems including Microsoft, Linux, Internet services, etc. The highest certification you can receive is LPIC-3; it consists of a 'core' certification earned with a single exam, plus further specializations in areas like security or mixed environments. People who take the level-three exams generally have already been doing sysadmin work in the field for a number of years. They are able to run large networks with many computers, can work seamlessly with a variety of operating systems and technologies, must know at least one programming language (like Perl, C, or Java), and must be skilled in all levels of Linux administration: security, installation, management, troubleshooting, and maintenance. No small task!
More and more companies, and even countries like Brazil, are switching to Linux or Linux-based software, making experience and certification with Linux vital for their IT employees. Pay, depending on experience and additional computer knowledge, ranges from $40-$50 an hour to $200k annually. You'll also find things like contract work for single projects, and some people find success working 'freelance' from their own homes, handling IT work on contract for multiple smaller companies and groups.
One of the best benefits people find with these jobs is the variety of locales: in the tech fields you can work from home, or move to any part of the world. This doesn't mean you are guaranteed a fancy job and lots of money for learning Linux; the hours are long, experience is highly valued, and you have to have diversity in your skills to make sure you are a valuable asset. Often you're on call, and people will take out their frustrations with technology on you. IT also carries a workplace stigma, which is great for television and humor, but can be isolating. For me, with a background in Political Science and soon a Masters in Criminal Justice, I see computer science knowledge as a way to keep up with crimes as they evolve, but also as a way to elevate the perception of my existing skill set in the workplace. After all, experience counts, but it isn't everything.
Sunday, October 30, 2011
For anyone who's worked in a business, the 'true cost' of software is less of a surprise, but all the more painful. Business and Commercial software licenses are astronomical. The cost is justified by the assumption that the license will be used by a larger number of people, or used to make money, but it doesn't always make it any easier to shell out thousands of dollars for software. For a small business license, popular computer site Newegg.com sells Microsoft Small Business Server 2008 for $3500. That's just for the permission to use the software, not the software itself. You can't even get pricing for use in larger businesses or government offices without setting up an account with a Microsoft Rep.
The extreme costs can be hard for commercial interests to accept, but they can be crippling for non-profit and public enterprises. The public sector is under tight scrutiny and even tighter monetary stress; the costs of software licensing can eat away at funds that are needed elsewhere. It becomes even more of a hardship when you take into account that you're not looking just at the cost of an operating system, but that every program you use requires an additional license: music software, word processing software, media and image software, spreadsheet and database software, even email and messaging software. The true cost of running a Microsoft or Apple office can skyrocket.
Open source suddenly becomes a much more palatable option, especially now that many open products are being developed with shells that closely resemble the commercial OSes people are familiar with. Not only that, open source products can be custom-tailored to the needs of each business without violating the license. Companies and organizations often employ their own IT staff, and open source software gives IT departments an opportunity for more control and involvement in their software. It seems that proper utilization of open source software can have not only monetary benefits but also quality benefits for many public and private organizations. Since this is an older story, I'd love to see how, 7 years later, this transition has gone and what sort of effect they are seeing.
Sunday, September 18, 2011
1. What is an argument? Give several examples.
An argument is short for "command line argument": information provided after the command name that tells the command what to act on. Most often arguments are specific files like file1, file2, and file8, or directories like /home, but they can also be things like usernames or search patterns.
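For example (the file names here are created on the spot so the commands can actually run):

```shell
# Work in a throwaway directory and create some files to point at.
cd "$(mktemp -d)"
touch file1 file2 file8      # three arguments, one per file created
cp file1 file1.bak           # two arguments: the source and the destination
ls file1 file2 file8         # arguments telling ls exactly what to list
```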
2. Use the man pages to tell me two options for the ls command and what they do.
Two options for the ls (list directory contents) command: -S sorts the listing by file size, largest first, and -r reverses the order while sorting. (Lowercase -s is different: it prints each file's allocated size.)
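A quick demonstration, using two throwaway files of different sizes:

```shell
# Make two files whose sizes differ, then sort the listing by size.
cd "$(mktemp -d)"
printf 'x'          > small      # 1 byte
printf 'xxxxxxxxxx' > big        # 10 bytes
ls -S        # by size, largest first: big, then small
ls -Sr       # -r reverses the sort: small, then big
```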
3. Use the internet to look up "The Cathedral and the Bazaar" and tell me what it is and why it is important.
"The Cathedral and the Bazaar" is an essay/book written by programmer Eric S. Raymond. Originally it was presented by Mr. Raymond to the Linux Kongress in 1997. Mr. Raymond's essay outlined the conflicts that arose from attempting to use the traditional (Cathedral style) method of software development for open source programming. Mr. Raymond established guidelines for creating good open-source software, and was a leading force in the movement towards the bottom-up development and bazaar style. He believed that by releasing your code early in the process and as often as possible, you not only benefit the project by allowing others to work it and develop it from there, but also by turning your users and peers into a beta-testing group to help tackle possible bugs and solve problems with greater efficiency.
Mr. Raymond's succinct argument persuaded much of the open source community to adopt the bazaar method of development. His points were based not only in highly practical approaches to problem solving, but also in a sort of moral code: obligations of users and developers to one another and to the community. These principles are the foundation of the open source movement and had a direct impact on things like Mozilla (and later Firefox), Google and Google Labs, and much user-generated content like Wikipedia.
Sunday, September 11, 2011
The word kernel means the fruit or meat of a nut or seed, removed and separate from its shell and exterior. It's a perfect description of what we now call a kernel in computing. The kernel is the meat of the operating system: the vital core that defines how a computer operates. A kernel directs and facilitates the use of resources by the software and hardware that make up a computer or system of computers, and it handles communication between different bits of hardware and software. In many ways, a kernel is like the conductor of an orchestra. The conductor cues and directs different parts of the music (software) to be played in conjunction with the necessary instruments (hardware), at the appropriate speed, volume, and place (resource allocation).
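You can watch the conductor at work from the command line; these read-only commands are safe to run anywhere, and their output will vary by machine:

```shell
# Peek at the kernel from userspace without changing anything.
uname -sr                  # which kernel is conducting (name and release)
ps -o pid,comm | head -5   # a few of the processes it is scheduling
```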
Tuesday, August 30, 2011
The seven things you don't need to know about me:
1) I have been separated from my husband nearly as long as I've been married.
2) I spend months each year planning, working on, and building a Theme Camp for Burning Man.
3) I am ambidextrous, particularly at the computer, but occasionally it leads me to not be able to figure out whether I am left or right handed during a task, or to be unable to tell if words are written backwards or forwards.
4) I have been a skier for the majority of my life and enjoy it so much I have spent winters as an instructor.
5) I have a '68 Ford Mustang that I had painted Tahoe Turquoise from original Ford paint samples.
6) My cat's name is Ash and I give him the majority of credit for my success working on the Obama Campaign in 2008.
7) I frequently and deliberately attempt random new recipes with little-to-no forethought.