
2015/08/01

What Language Should I Learn? Three Answers

A friend of mine, who works in IT but is not a developer, asked me a question during lunch today.

"I want to learn to program. What language should I learn first?"

This is a common question, one I have answered before. But since I've been blogging a lot recently, I decided to write it up and post it here.

I told him I have three answers.

The first is somewhat sarcastic. "I'm going to Europe. What language should I learn?" There just isn't enough information to answer that question, because many languages are very context-specific. If you're hoping to get into programming apps for the iPhone, your best choice is Objective-C. If you want to code for Arduino microcontrollers, you'll want to start with the Arduino IDE and its very C-like language. And, of course, there's JavaScript, your only choice on web browsers.

Where you want to go determines what you should learn.

But there's more to it than that.

There's a thing called the Church-Turing thesis, which states that anything that can be effectively calculated can be computed by a machine. Turing formalized this with his Turing machine, while Church used the lambda calculus.

From there we get to a concept called Turing completeness. Anything that can be computed by one Turing-complete machine can be simulated by another Turing-complete machine. The first real use of this was the creation of compilers: higher-level languages that developers can write in, which compile to machine code that the hardware itself can run. What it means, for those learning, is that it doesn't really matter what language you learn; anything one language can do, another language can do too.
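
To make that concrete, here's a small illustration of my own (not part of the original lunch conversation): the same factorial computation written twice in JavaScript, once in Turing's style as a loop stepping through explicit state, and once in Church's style as a pure recursive function. Either form can express what the other does.

    // Turing's flavor: explicit state, mutated step by step in a loop.
    function factorialIterative(n) {
      let accumulator = 1;
      for (let i = 2; i <= n; i++) {
        accumulator *= i;
      }
      return accumulator;
    }

    // Church's flavor: a pure function defined by recursion, no mutation.
    const factorialRecursive = n => n <= 1 ? 1 : n * factorialRecursive(n - 1);

    console.log(factorialIterative(5));  // 120
    console.log(factorialRecursive(5));  // 120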

So the second answer is that Alan Turing would tell you it just doesn't matter which language you choose: what you do and learn in one language can be simulated or applied in another.

When Jeff Atwood of Coding Horror coined Atwood's Law -- any application that can be written in JavaScript, will eventually be written in JavaScript -- he didn't know the half of it. He knew that graphical applications were starting to be done within web browsers, like Gmail. He didn't know that web server applications and even command-line applications could be written in JavaScript via Node.js. He didn't know that Cordova, a framework for creating cross-platform mobile applications using web technologies including JavaScript, would come along. He didn't know that Microsoft would allow developers to create Windows applications using HTML and JavaScript. He didn't know that open-source microcontrollers such as Arduino would be developed, and that frameworks such as Johnny-Five would come along to let you build Internet of Things projects and even robots with JavaScript. It might be a bit more complex to set these things up in JavaScript, but they are possible.
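
As a small sketch of that reach (my example; it assumes Node.js is installed and the file is saved as hello.js, a name I made up): the very same language prints to the command line and serves a web page.

    // hello.js -- run with: node hello.js
    const http = require('http');

    // Command-line application behavior...
    console.log('Hello from the command line');

    // ...and a web server, in the same file and the same language.
    http.createServer((request, response) => {
      response.end('Hello from the server');
    }).listen(8080);
    console.log('Now visit http://localhost:8080/');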

Plus, if your coding plans are more functional and computer-theoretical, you'll be glad to know that JavaScript is, at heart, a Lisp.
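
What that claim cashes out to, in a few lines of my own: functions are first-class values that close over their environment, which is the Lisp heart of the language.

    // Functions are values: build new functions out of old ones.
    const compose = (f, g) => x => f(g(x));
    const addOne = x => x + 1;
    const double = x => x * 2;

    // A closure: makeCounter's local state lives on in the returned function.
    function makeCounter() {
      let count = 0;
      return () => ++count;
    }

    const next = makeCounter();
    console.log(compose(double, addOne)(3)); // 8
    console.log(next(), next(), next());     // 1 2 3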

If you want to code Objective-C, you need a Mac and the Apple development tools. If you want to code C#, you'll need to install Visual Studio tools from Microsoft (or Mono on Linux). If you want to code JavaScript, you need a text editor (one comes with your computer, I promise) and a web browser (one comes with your computer, I promise), and there are even places like CodeBin where you can enter your code into the browser itself.
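
To show how low that bar really is, here's about the smallest possible start (the file name is mine): type this into any text editor, save it as hello.html, and open it with any browser.

    <!-- hello.html: no tools to install, just a text editor and a browser -->
    <!DOCTYPE html>
    <html>
      <body>
        <script>
          // JavaScript running right in the page you just saved.
          document.body.textContent = 'Hello from JavaScript';
        </script>
      </body>
    </html>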

If you're going to be writing an operating system or device drivers, you will want something that compiles to native machine code. If you're looking to get into a specific project, you'll want to know the language of that project. But the corners of the development landscape where JavaScript is the wrong choice are small and shrinking. So, the third answer is: it might as well be JavaScript.

This rubs me a bit wrong. I've built my rep as a Perl Monger, and I always feel like Perl is where you should start. But while my heart feels that, my mind argues the above: the greater forces of modern computing are pushing JavaScript into a front-row seat in the language arena.

But I'm willing to be wrong, and if I am, I want to know. Where am I wrong? What would you tell someone wanting to learn to program?

2012/05/15

Please Code Responsibly.

Jeff Atwood of Coding Horror and Stack Exchange has offended some people by saying "Please Don't Learn To Code". My vector on this is that he's wrong, or at least pointing at the wrong metaphor. He writes:

Look, I love programming. I also believe programming is important … in the right context, for some people. But so are a lot of skills. I would no more urge everyone to learn programming than I would urge everyone to learn plumbing. That'd be ridiculous, right?
Yeah, it would be ridiculous. It would be downright silly. There certainly isn't any market for people gearing up and taking care of their plumbing themselves.


Clearly, if you are not a plumber, there are jobs too big or too complex for you. But suppose rain, a failed sump pump, and a previous owner who never installed a check valve on the sump-to-septic-tank pipe have flooded your basement with water and some sewage. The cost of replacing and repairing everything is high enough, and with a little knowledge (which my friend Mike had), sump pumps and check valves are easy enough to install yourself, leaving your money free to buy bleach and a new mop, and to hire a dumpster to hold all the ruined things.

So, it would not be ridiculous to learn something about plumbing. If you wanted to do something more advanced, you'd want to go to a professional, but certainly it is fine for me to learn enough to switch out shower heads and faucets and to fix the disaster in my basement, right?

I think the same thing about computing, although more positively than the basement-disaster story. Someone may want to collect and graph their monthly bills to determine whether it's time to cut the cord and watch only Hulu and Netflix, or to connect things so that one email texts all their children and tells them to come to dinner. Or something else. There are wonderful tools that make these things easier, like Yahoo! Pipes and IFTTT, but even then, it helps to have some sense of data structures and algorithms. Not much, but something. Just as there's room between real hackers who write low-level things and people like Jeff and me who write the things that sit above them, there's room between us and the people at home who want to connect a new sprayer to the Internet pipe.
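
For scale, here's the kind of small, unprofessional program I have in mind, with made-up numbers (a sketch in JavaScript, though a spreadsheet would do too):

    // Made-up monthly bills: is cable worth it next to streaming?
    const bills = [
      { month: '2012-01', cable: 85.00, streaming: 15.98 },
      { month: '2012-02', cable: 85.00, streaming: 15.98 },
      { month: '2012-03', cable: 92.50, streaming: 15.98 },
    ];

    // A little algorithm, a little data structure: sum each column.
    const total = key => bills.reduce((sum, bill) => sum + bill[key], 0);

    console.log('Cable so far:     $' + total('cable').toFixed(2));
    console.log('Streaming so far: $' + total('streaming').toFixed(2));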

Are they going to be professional programmers? Probably not. And are there dangers here? Yes, there are, just as there was a danger in a previous owner not installing a check valve, which allowed the septic tank to drain into my basement. But there are worse dangers.

2010/08/23

The Community is Solving My Problems!

The Stack Exchange Crew is pushing a few domain-specific Stack sites for beta-testing. The one I've been most involved in is the Ubuntu one. I had two specific problems which I think I have complained about there. The first is how, for no reason I could identify, I would lose the ability to view the contents of my ~ folder. The other is that I used to be able to connect my netbook's LINE OUT to my desktop's LINE IN and get both audio streams in one set of headphones.

The second one was the easier fix: simply using gst-launch to bridge my desktop's input and output. I'm running media on my netbook right now, with my headphones plugged into my desktop. It took one simple apt-get install to implement.
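
For the curious, the pipeline was something along these lines; this is a sketch from memory, not my exact command, and the element names (ALSA's, here) depend on your sound setup:

    # Read from the sound card's input and play it straight back out.
    gst-launch alsasrc ! audioconvert ! alsasink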

The first one took a little more talking, but the top contender is that unstable symbolic links are bringing down /usr/bin/ls but not /bin/ls, and that my 12 sshfs mount points, which I keep in /sshmounts so that du won't search them, are where the instability lies. I'm waiting for the next time problems on my home box or one of the other mount points cause the issue again before I mark it answered, but I think it is.

I have logged into a few others, dealing with computer science and user interfaces, but Ubuntu is the one where I have had the most questions and found the most answers.

2010/07/29

Handwriting Recognition

Jeff Atwood went on about speech recognition and how, despite what Star Trek says, it's not likely to become a major mode of computer interface. Not that I've done much with speech recognition, but it sounds about right to me. Honestly, when I want to tell the computer something, I want to poke it with my mouse or trackball, or type at it with a keyboard. (When I want it to tell me something, I am not against having it talk to me. I have written a script that makes a time string that's much more friendly to Festival: "Eleven 55 A M".)
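
I don't have that script in front of me, but here's the idea, reconstructed as a sketch in JavaScript (the function name and details are mine, not the original's):

    // Turn a 24-hour time into a string Festival pronounces naturally.
    const HOUR_WORDS = ['Twelve', 'One', 'Two', 'Three', 'Four', 'Five',
                        'Six', 'Seven', 'Eight', 'Nine', 'Ten', 'Eleven'];

    function speakableTime(hours24, minutes) {
      const suffix = hours24 < 12 ? 'A M' : 'P M';
      const hourWord = HOUR_WORDS[hours24 % 12];
      // "O 5" reads aloud better than "05"; bare minutes are fine past nine.
      const minutePart = minutes === 0 ? "o'clock"
                       : minutes < 10 ? 'O ' + minutes
                       : String(minutes);
      return hourWord + ' ' + minutePart + ' ' + suffix;
    }

    console.log(speakableTime(11, 55)); // "Eleven 55 A M"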


Toward the end, Jeff goes on about handwriting recognition, starting with the Apple Newton:

I learned Palm's Graffiti handwriting recognition language and got fairly proficient with it. More than ten years later, you'd expect to see massively improved handwriting recognition of some sort in today's iPads and iPhones and iOthers, right? Well, maybe, if by "massively improved" you mean "nonexistent".

There is a point I think he fails to consider. The Apple Newton's handwriting recognition was trying to recognize your handwriting. Graffiti was more about you learning to write the variant letter set that Graffiti could understand.

You could take notes on a Palm, if your concept of "taking notes" is "Get eggs on way home". But, if your idea of taking notes was "I'm taking a Senior-level networking course with Comer and don't want to use a paper notebook", you could not really do that on a Palm. The key is that it's faster and easier to type, especially if you can use a dictionary and do lookahead guessing of what you're trying to type. (Although my wife has problems with her Blackberry understanding what she means.) Typing won out in the marketplace of portable items because it's easier to do at speed than handwriting recognition.
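
That "dictionary and lookahead guessing" is less magic than it sounds; at its simplest it is a prefix filter, sketched here with a toy word list of my choosing:

    // A toy sketch of lookahead guessing: suggest completions by prefix.
    const dictionary = ['network', 'networking', 'note', 'notebook', 'nothing'];

    const suggest = prefix =>
      dictionary.filter(word => word.startsWith(prefix));

    console.log(suggest('netw')); // [ 'network', 'networking' ]
    console.log(suggest('note')); // [ 'note', 'notebook' ]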


But, considering the lineage of the iPhone, it makes me wonder. "Handwriting recognition. Is there an app for that?"