
OT: Coding/programming


ArrMatey


I know it's off topic, but after nearly 8 years of not having touched any code, I've come to a crossroads in my life where I understand that I won't put food on my family's table unless I get my ass back into coding. I haven't programmed in a long time, and I'd like to eventually get back into it.

 

I've started doing the tutorials on Codecademy, and a friend has recommended getting Eclipse and working through an Android tutorial, which all seems interesting. Anyone else do this kind of work? I just think it could be something to get back into, since it's so in demand at the moment.


I've made my way through a large part of the Java Codecademy stuff and thought it was all really awesome. It says there's a Python section, but I can't seem to find it.

 

I'm most fluent in stuff like Fortran and IDL for computational physics, so Java is pretty different for me (objects are particularly interesting).


Study fundamentals. By all means get some language/platform coding experience, but set yourself up to succeed. Learn effective TDD (Test-Driven Development); Growing Object-Oriented Software, Guided by Tests is a good book on this. Learn and understand the SOLID principles (if you are going OO; FLUID if dynamic). Study design patterns using the Gang of Four book. Understand the current architectures for the field you are interested in (MVC, REST (SOA), MVVM, etc.). Know how the platform you use works at a lower level.
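To make one of those principles concrete, here is a minimal C++ sketch of dependency inversion (the "D" in SOLID); the `Logger` and `ReportGenerator` names are invented for illustration. The same seam is what makes TDD practical, since a test can hand in a fake logger:

```cpp
#include <string>
#include <vector>

// Dependency Inversion: high-level code depends on an abstraction,
// not on any concrete logger, so implementations can be swapped
// (or faked in a unit test) without touching ReportGenerator.
struct Logger {
    virtual ~Logger() = default;
    virtual void log(const std::string& msg) = 0;
};

// A test double that just records messages -- the kind of seam TDD relies on.
struct MemoryLogger : Logger {
    std::vector<std::string> lines;
    void log(const std::string& msg) override { lines.push_back(msg); }
};

// High-level policy: knows nothing about where log output goes.
class ReportGenerator {
    Logger& logger_;
public:
    explicit ReportGenerator(Logger& logger) : logger_(logger) {}
    int generate(int items) {
        logger_.log("generating report");
        return items * 2;  // stand-in for real work
    }
};
```

In production you would pass in a file or console logger instead; the point is that `ReportGenerator` never has to change.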

 

Most of all, have fun :wave:


 

.NET and databases are where the money's at.

 

 

It depends on what you want to do. Do you want to work at a startup and get in on the ground floor? If so, JS, MongoDB/CouchDB, RoR, etc. are the way to go. I guess for me technology is a secondary concern; the concepts and skills needed to write effective software are far more important. Frameworks, languages, and flavours of the month come and go. When I hire devs, I look for fundamentals.


Yeah, I dived straight into coding for Android for a piece of coursework without properly doing the Java-y stuff first, and really regretted it. There's a lot of very specific stuff in Android that's much better understood in a wider context.


There's a lot of demand now for C# and .NET in the corporate environment. I can understand why. If you're thoroughly immersed in the .NET class libraries then you can crank out very nice looking GUI apps in a very short amount of time.

On the downside, .NET apps are bulky and slow. Microsoft has an extraordinary talent for taking every new advancement in computer hardware and sucking the juice out of it until it runs like a 486.

The other problem, in my opinion, is the learning curve brought on by the sheer size of the class libraries. Programming directly to the Windows GDI is like building a house with a small tool box - you'll need a variety of different tools to get even the simplest job done, but you'll eventually get it finished. Programming with .NET is like building a house with a massive warehouse full of tools - somewhere in that warehouse is the one tool that does exactly what you want in one step, but you've got to hunt for it and then learn how to use it. It gets even worse when you start piling third party libraries on top of .NET, or massive custom libraries that were developed in-house that the company forces you to use. Heaven help you if the guys who developed those libraries didn't document their work.

The upside is that once you've figured out where everything is you can be uber productive. :thu:

I can relate to fly135. I migrated from electronics engineering to coding in the mid 80's, and have bounced between the two since then, though mostly coding for the past 20 years. It was a lot of fun back when you could beat on the hardware and try to push the envelope a little. These days that envelope is way the hell out there, and the OS puts up a massive brick wall between you and the hardware. Not much fun anymore, but at least it pays pretty well.



 

I'm really sorry to hear that's the state of coding today, but not at all surprised.

 

I left the EE/SW world in about '95 and never really looked back into it closely. Some of the earliest commercial C++ code was written at my old place of work (Mentor Graphics)... I can remember meeting Bjarne at one of the flyout talks he gave with us... he was rather stunned to hear we were going to release products based on his UNreleased compilers... basically, he thought we were crazy and was worried as hell. I hope that guy writes a book about the HUMAN side of the coding world... I would buy it... pretty funny guy, that Bjarne.

 

Anyway, there was a dearth of classes back then (in the '80s), and so much of what we wrote was heinously inefficient... can you imagine a program that instantiated a 12-megabyte memory partition just to house "text and a page"? Well, I imagine there's a bunch of that nowadays. Just the massive inefficiencies due to variable name conventions... I wonder if that has changed? I got disenchanted and moved on. Has the industry ever gotten beyond the notion that "you need to have obtained a degree within the past 10 years to be employable"? That was the one that pushed me out the door, since what I was doing was not being taught (in a focused manner) at the time. It was very sad to see many bright coders pushed either into management (where they regularly performed at mediocre-to-inept levels) or out the door.

 

Huh... just imagine how incredibly fast today's machines would be if we didn't have all those mandated inefficiencies. Latency issues in DAWs would be non-existent.

 

...a warehouse of tools, but no map for access. Isn't there some precedent in ancient history? Would the Tower of Babel apply? "All the people speaking one language" being scattered, then learning separate languages... then they could not talk to each other... it won't be long before elements of the C++ world get to be like that. I would expect tremendous inefficiencies (classes being re-invented over and over) and a slow evolution into specialized dialects which can no longer be linked together.

 

...and then the hand of Stroustrup smote them down, and He proclaimed, "For you, there will be no New land. You have overloaded that which has been overloaded before, losing sight of the origin of classes. What was a Beagle is now a Crystal Palace. This frontier you have created is a vast desert containing only 8 bits of wisdom... and you use but 2 of those. Compiler Error. Parity Lost. It OK, it go down drain." And with that he toggled the corruption bit in the next (and last) C++ compiler build, and de-created the world he had built. Slowly, and at random over the next 2 years, programs began to fail as bitterpill__arsenic opened and executed. All pointers to dev_null. And all went to black.


I would highly suggest learning the latest .NET Framework and focusing on C# as the programming language. As others have said, spend a lot of time learning the fundamentals of object-oriented programming, as they will also apply to other modern programming languages.

As far as developing applications, and more so the interface side of applications, I would focus on web-based apps rather than client apps that are actually installed. Most corporations tend to favor web-based (i.e., web browser) business solutions for the obvious ease of deployment. I would pick up a good book on ASP.NET MVC and Razor. You should be able to find books that cover the C# basics, ASP.NET MVC, and database access all in one.

Once you get a good grasp of those basics, start learning how to program against database backends such as SQL Server (one of the easier ones to start with). If you don't know SQL (Structured Query Language), you'll definitely want to learn it, as it's applicable to most databases. Most bigger companies will be using either Microsoft SQL Server or Oracle as their DBMS (or even both). You can download personal copies of either DBMS for Windows.

Those are good starting points. There is much, much more to learn after this, but there is no sense in overwhelming yourself with everything at once.

These recommendations are based on being a professional programmer for ~18 years. I've worked as a consultant, a contractor, and a full-time employee for quite a few Fortune 500 companies. The Microsoft programming tools have consistently been the most popular platform for the majority of my career, and more programming jobs want those skills than any other.


With all due respect to Bjarne Stroustrup, who is a brilliant programmer, I think C++ was a giant step backward, but let me start from the beginning...

C is an abomination. It was intended to be a low-level/high-level language. By that I mean it was intended to be a high-level language, independent of the processor architecture, but low-level in that it had a very limited set of intrinsics which were about as close as you could get to machine language level operations without actually writing in assembly language. The goal was to produce a language that would compile to reasonably small and efficient programs, perhaps no more than 10% or 20% slower than a comparable program written in assembly language.

What's wrong with C is that Dennis Ritchie designed it using a Teletype 33, the most common terminal in use at the time. If you've ever used a Teletype 33 then you know that they are SSSSLLLLOOOOWWWW! You press a key (with considerable effort) and the key stays locked down while a marvel of mechanical engineering kicks in under the hood. After about 1/2 a second of whirring and buzzing that single character would have finally been transmitted, received back from the mainframe and printed, and the key would pop up again. You're now ready to type another character.

It doesn't take a genius to understand that entering even a short phrase can be time consuming. Touch typing? Not in your wildest dreams! Although the keys are laid out approximately like a typewriter, they are round rather than square, they take far more effort than even a manual Olympia typewriter to press, and (as mentioned above) it's really, really slow. At the forefront of Dennis Ritchie's mind was entering lines of code with the fewest possible keystrokes. This meant very liberal use of symbols (those little characters on the number keys that you have to hold the shift key down to type).

Fast forward 10 years. Mainframes and minicomputers now mostly use video terminals. Desktop personal computers are becoming common. Touch-type keyboards abound. And programmers are having to learn how to type C code, with its heavy use of symbols, their pinkies frequently reaching for the shift key. A programmer who learned to touch type at a respectable speed would be lucky to get 1/4 of that speed when entering C code because he's constantly reaching for the shift key. I don't know about you, but I can type "or" a heckuva lot faster than I can type "||". Even the most common characters - the curly braces - require the shift key.
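As a footnote to the shift-key complaint: C++ did eventually standardize alternative spellings for exactly these operators (`and`, `or`, `not`, and friends; C gets the same spellings as macros via `<iso646.h>`), though hardly anyone uses them:

```cpp
// `and`, `or`, and `not` are standard alternative tokens in C++ --
// no shift key required, and no header needed.
bool in_range(int x, int lo, int hi) {
    return x >= lo and x <= hi;      // equivalent to: x >= lo && x <= hi
}

bool out_of_range(int x, int lo, int hi) {
    return not in_range(x, lo, hi);  // equivalent to: !in_range(x, lo, hi)
}
```

They compile on any conforming C++ compiler, but most codebases stick with the symbol forms out of convention.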

By this time even Dennis Ritchie is wondering "What was I thinking?" :facepalm:

Then there are some confounding syntactical conventions. You can type a statement line as long as you want. It doesn't matter if it won't fit on a single line on the Teletype 33 because you can continue it on the next line. You can get away with this because a statement line doesn't end until you type a semicolon. The only problem with this is that there is virtually no limit to the length of a line when using a video terminal or computer screen. Using multiple physical lines for a single statement is an exception rather than the rule, so providing a way to extend that statement line to the next physical line should have been an exception. Nope. You gotta type a semicolon to end the line.

But wait - That rule doesn't apply to preprocessor directives. Those MUST end on the current line, or you have to use a backslash to continue to the next line. Nothing like having two different conventions for the same thing.
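The two conventions side by side, as a short sketch:

```cpp
// A statement ends at the semicolon, so it can sprawl across physical
// lines with no continuation marker at all...
int total = 1 + 2 +
            3 + 4 +
            5;

// ...but a preprocessor directive ends at the newline, so continuing
// one needs an explicit backslash -- a second convention for the same idea.
#define SUM_TO_FIVE (1 + 2 + \
                     3 + 4 + \
                     5)
```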

C++ - a noble effort to try to bring the concept of object oriented programming to C language programmers. The problem? C was meant to be the lowest of the high level programming languages. Efficiency and speed and all of that. Now we have the compiler doing all sorts of magical things behind your back, and if you don't explicitly understand what the compiler is going to do then you may not get the results you expect, and it may take some head scratching to figure out why.

Create a new instance of a class? No problem! But wait - why did we run out of stack space? Well, that's what happens when you create instances of classes that require a lot of storage, and you create them in local scope. Of course, you have to know something about the storage requirements of a particular class in order to figure that out.
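A sketch of that trap (the 8 MB figure is made up for illustration; default thread stacks are commonly only 1-8 MB):

```cpp
#include <array>
#include <memory>

// A class whose storage requirement is invisible at the call site.
struct HugeBuffer {
    std::array<char, 8 * 1024 * 1024> data{};  // 8 MB -- hypothetical size
};

// Dangerous: `HugeBuffer b;` in local scope would put all 8 MB on the
// stack frame, which can blow a default-sized thread stack.
// void risky() { HugeBuffer b; /* ... */ }

// Safer: allocate on the heap; only the pointer lives on the stack.
std::unique_ptr<HugeBuffer> make_buffer() {
    return std::make_unique<HugeBuffer>();
}
```

Nothing in the syntax `HugeBuffer b;` warns you which case you're in; you have to know the class.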

Overloading functions - what a wonderful idea! Now you're staring at someone else's code trying to figure out which overloaded function is going to be called because you have no idea if "value" is an integer, a long, or a float. First step is to figure out where the hell "value" is declared.
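A minimal illustration of that guessing game; which overload fires depends entirely on the declared type of the argument, which may be declared pages away:

```cpp
// Four overloads; the call site gives no hint which one runs.
char describe(int)    { return 'i'; }
char describe(long)   { return 'l'; }
char describe(float)  { return 'f'; }
char describe(double) { return 'd'; }

// If `value` is declared `long value = ...;` somewhere far above,
// describe(value) silently picks the `long` overload.
```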

Overloading operators - even better! Why the hell is this guy adding two strings together? Oh, wait. He's overloaded the "+" operator to perform string concatenation. You're struck with the realization that the guy who wrote the code has effectively created his own language, albeit with a syntax similar to C, which you now have to learn if you're going to figure out where the bug is.
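For example, here is a tiny type whose author decided `+` means concatenation (the `Label` type is invented for illustration). Nothing at the call site says so; you have to go find the definition:

```cpp
#include <string>

// Looks like arithmetic at the call site, but it's concatenation.
struct Label {
    std::string text;
};

Label operator+(const Label& a, const Label& b) {
    return Label{a.text + b.text};
}
```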

I'll leave C# for another rant... :cop:


 

 

 

I heard about this from my Computer Science teacher and friend back in 2000. It has been a bit of a hassle, and my biggest problem with most computer languages is their often completely different approaches to object-based programming. Each language uses a different set of tools to call upon. Finding the algorithm and the pattern that works is one thing; you then need to apply it to different languages, especially if you are switching platforms.


I have been a professional developer for around 20 years and a .NET developer for 10 of them. I think the pertinent question for the OP is - is it worth investing time in? Will it get me a good job? Will it still be one of the dominant languages in 5-10 years?

 

I think .NET is on the decline as a percentage of the available jobs. Microsoft as a leader in the industry is declining, and newer technologies have been gaining traction. Having said that, there are tons of large applications and systems running fine on .NET and even if all new dev were to stop today it will be around for a long time.

 

Personally, if I were starting over again today, I'd focus on iOS development and/or HTML 5. I have no doubt that with some focus you can get a job within a year.


 


 

 

Thanks for your insight. I'll definitely read up on that. I think I have a lot on my plate before I can master all of that, but I really appreciate the advice, as you guys probably know better what coding trends are coming.


Archived

This topic is now archived and is closed to further replies.
