One of our grad student TAs basically did this in class today (subbing for the professor). He wrote a 10+ case switch statement to test whether a character was in a certain range of the alphabet. To be fair, he may have just been rolling with one of the students' attempts, but even in that case, what a terrible thing to teach.

If this is the level of programming that actual grads are at, I guess I shouldn't have too much trouble getting a job, huh?


Why didn't you stop him? Maybe he was testing to see if you were dumb sheep.

that's a commonly used pattern for keyboard event handling, what's the problem?

Because you can solve this problem with a conjunction of two comparisons.

literally how would you do it

case Input_Char is
   when Low_Char .. High_Char => Do_Stuff;
   when others => Do_Other_Stuff;
end case;
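
In C the same idea really is just the conjunction of two comparisons, assuming an ASCII-like encoding where the letters in question are contiguous (a minimal sketch; in_range is a made-up name):

#include <stdio.h>

/* Returns 1 if c falls in the inclusive range [low, high]. */
int in_range(char c, char low, char high)
{
    return c >= low && c <= high;
}

int main(void)
{
    char c = 'q';
    if (in_range(c, 'a', 'z'))
        printf("'%c' is a lowercase letter\n", c);
    return 0;
}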

It depends: are you willing to accept the same salary as, or worse than, a guy who shits in the street? Don't think you're hot shit just because you're better than a drooling retard; that's the biggest mistake you can make.

That works assuming no additional letters are in that range, which is not always the case. For example, the range for Japanese Hiragana is U+3040 to U+309F, but inside that range you have characters like ゝ and ゟ, standalone voicing marks, and you may not want all the small っ, ぃ, ぅ, ょ type characters. If the number of breaks in the range is significant, then a switch statement would be bad; you would probably do something like load the valid characters into an array and search it to check.

In OP's example a pure range was probably sufficient, but it's worth mentioning that a range alone is not always enough.
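
A minimal sketch of that array approach in C, assuming you're working with decoded UTF-32 codepoints; the allowlist entries here are illustrative, not a complete Hiragana table:

#include <stddef.h>

/* Illustrative allowlist of accepted codepoints (not a full table). */
static const unsigned int valid[] = {
    0x3042, /* HIRAGANA LETTER A */
    0x3044, /* HIRAGANA LETTER I */
    0x3046, /* HIRAGANA LETTER U */
    0x304B, /* HIRAGANA LETTER KA */
};

/* Linear scan; perfectly fine for a table this small. */
int is_valid(unsigned int cp)
{
    for (size_t i = 0; i < sizeof valid / sizeof valid[0]; i++)
        if (valid[i] == cp)
            return 1;
    return 0;
}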

I took an algorithms course where every week you got problems to solve and verified the solutions automatically on a website, so there was a time limit.

There were some problems I just couldn't solve, no matter the amount of time or information available on the Internet. But there were some incredibly talented guys in that class who just nailed them all consistently.

There were also competitions where you had no help at all and solved problems over 5 hours or so. The same talented guys could solo them all, while others worked in teams of two and still couldn't solve them all in the given time.

Point is, I also thought I was "the best programmer in the world" before starting my studies. But afterwards I realized there are some incredibly talented people who are way, way better than me. So whenever I see threads saying that all grads suck or whatever, I just think back on myself and how naive I was. Not only in algorithms, but in math and other theoretical areas as well.

Either that or your university sucks or something.

Knowing your limits and weaknesses is definitely important - if you aren't aware of them then you can never improve yourself.

These people who just knock out algorithm problems: I won't say they're not skilled, because they are, but maybe you could say they don't apply themselves to real-world problems and they don't innovate. I suspect those people are competitive programmers. They memorize algorithms and when to use them in different cases (anyone heard of the blossom algorithm for that one particular type of problem? No? Oh, so sorry, guess you need to solve it with the dynamic programming approach then). They study how to translate a problem statement to recognize the underlying algorithm. But solving these challenges becomes an end unto itself for many. And the algorithms they use: they're just echoing what they've learned. How many of them improve on the original or discover a new, better algorithm? Not many. So I wouldn't beat yourself up over it.

I've been looking into competitive programming, as it's a good way to improve overall programming skills. You may want to read the book "Competitive Programming" by Steven Halim. You'll learn a lot, and the problem-solving mindset it teaches is useful for attacking new problems.
comp.nus.edu.sg/~stevenha/myteaching/competitive_programming/cp1.pdf

you forgot to mention that the array must be sorted and the search done with a binary search algorithm
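
Strictly, that's only a win once the table gets big; a linear scan is fine for a handful of entries. But for the sorted-array version, the C standard library's bsearch already does the job (a sketch; the table contents are illustrative):

#include <stdlib.h>

/* Must be kept sorted in ascending order for bsearch to work. */
static const unsigned int valid_sorted[] = {
    0x3042, 0x3044, 0x3046, 0x304B, 0x304D,
};

static int cmp_uint(const void *a, const void *b)
{
    unsigned int x = *(const unsigned int *)a;
    unsigned int y = *(const unsigned int *)b;
    return (x > y) - (x < y);
}

int is_valid(unsigned int cp)
{
    return bsearch(&cp, valid_sorted,
                   sizeof valid_sorted / sizeof valid_sorted[0],
                   sizeof valid_sorted[0], cmp_uint) != NULL;
}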

Anyone with a minimum of communication skill can do that. If you see yourself in a high position, doing projects that matter, giving lectures, writing a book, and publishing papers in some field... all of it before your 40s... you might want to reconsider your life.

It's not about how much you know.
Do you have charisma? Are you good looking? Because that's what really matters these days.

I'm pretty sure there isn't even a concept of alphabetic order for moonrunes.

My prof literally wrote this today and I can't even
sub permute (&@) {
    my $code = shift;
    my @idx  = 0 .. $#_;
    # Call the block on each permutation of the arguments, stepping
    # @idx through permutations in lexicographic order, until the
    # block returns false or no next permutation exists.
    while ( $code->( @_[@idx] ) ) {
        my $p = $#idx;
        --$p while $idx[ $p - 1 ] > $idx[$p];
        my $q = $p or return;    # fully descending: we're done
        push @idx, reverse splice @idx, $p;
        ++$q while $idx[ $p - 1 ] > $idx[$q];
        @idx[ $p - 1, $q ] = @idx[ $q, $p - 1 ];
    }
}

There are definitely smarter people than me at the uni, but it's still surprising and reassuring to see the level a lot of people are at. There are pretty much no students who use Linux or are comfortable with the CLI, people don't have even an amateur understanding of hardware architecture, and then I have a TA writing Holla Forums shitpost meme code in class. I'll have a lot to learn, but now I know I'll be able to excel.
/blog

Using GNU/Linux and being good at CS have nothing to do with each other. Here's a nice picture as a demonstration.

Using linux:
Yes, of course you can know the theory and be an amazing coder while never leaving Visual Studio or whatever. But using Linux does put you at an advantage in some ways.

No, I already said I have a lot to learn. I'm not very skilled at all yet.

As you said yourself, it's entirely possible to be a great programmer without using GNU/Linux or other Unix clones. It's a mistake to look down on your peers because they don't use it.

I wasn't talking about looking down on them in terms of their abilities, but come on.

If you're using Windows 10, you're taking it in the ass from the NSA, and you will never get to look at the source for 99% of the programs you use. Being okay with proprietary software means you see coding as something separate from politics, which is the attitude that's leading us straight to a hellish dystopia. And the majority of these students don't even have a reason to stay with Windows; they're not art majors, they're programmers.

You're incorrect on one major point: nothing about those issues has anything to do with being a good programmer. A good programmer may care about them, but so do many bad programmers and people who don't program at all. All you've done is find a different reason to feel superior to your peers.

You have fuckin autism m8, I literally never said it did.


Windows users are inferior, pleb. Deal with it. Also, programming and science in general aren't ends in themselves.

There is nothing philosophically wrong with this. Check your ASCII privilege. Check your weakly typed privilege.

Harmful

I understand that one can practice to become good at applying algorithms to problems, and you can definitely memorize a bunch of algorithms. It was the usual graph algorithms and so on, though also mathematical and probability problems that don't really have a ready-made algorithm. However, there were also problems that came close to real-world problems that companies needed help solving, or were at least inspired by them. So I don't entirely agree with your argument.


The computer labs at my university only used GNU/Linux on their workstations, first Debian and then they switched to Ubuntu. Many courses required you to use the terminal, such as working in a REPL or using pipes, etc. I remember using iptables in a security course. Basic stuff perhaps, but it gets you used to working in such an environment. When I did the competitive programming course, my student partner and I used vim, for example, to write the code. The university is not located in the USA, so maybe they haven't been bribed to death by Microsoft.

Because it's most likely a demonstration of the switch statement, you autistic smartass.
Also, ASCII isn't standard in C.

what am I looking at here? What language is that even?

The syntax doesn't give it away? And you seriously can't tell what language that is?

it's Perl

Ah, so that's why I can't tell what the hell is going on in that code.

lol wtf, by case #3 I'd be bored out of my mind and scouring Stack Overflow for how to do "if character is in set" in my language.

Probably this. Or maybe the class had only covered switch so far, and he didn't want the dumber students to get confused by more advanced constructs. Sometimes it's easier to teach the tedious way first and then say "now let's learn how we can write algorithms like this more concisely with this helpful feature of our language".

"Here's how you do it like a retard. Next week we'll cover how to do it in a way that doesn't mark you as a braindead retard. I'll be using this to determine attendance numbers at the end of the semester."

if(i & 1){cout

i is a variable local to that statement. Unless you've got hardware problems, are doing memory fuckery, or have a computer that no longer belongs to you, i is not going to change halfway through.

Everyone's talking about what he actually posted about, not the image.

Sorry, I meant the array value

I skimmed the thread and half the people are talking about the code as usual


This pic gets me every fucking time. It's either the best collective bait ever constructed or Holla Forums' average programming knowledge.

Where in that block of code do you see something modifying the array?

Nowhere, but there might be some other process running which would change the array value mid-evaluation. For example, an interrupt routine which updates the values in the array.
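
For what it's worth, if an interrupt really can rewrite the data, the fix in C is to declare the object volatile and snapshot it before testing. A minimal sketch under that assumption (samples and classify are made-up names):

#include <stdint.h>

/* Hypothetical buffer rewritten asynchronously by an interrupt handler. */
volatile uint8_t samples[16];

void classify(int i)
{
    /* volatile forces a real read on every access, so testing       */
    /* samples[i] twice could legitimately see two different values. */
    /* One local snapshot avoids that.                                */
    uint8_t v = samples[i];
    if (v & 1) {
        /* odd */
    } else {
        /* even */
    }
}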

well hurf de durf i'm retarded

Nowhere was that a stated assumption, so nowhere in this context would you assume so. Quit being retarded.

You know it's the latter.

Non-programmer here. What's going on in this picture?

Actually there is for hiragana and katakana.

Fucking learn then faggot.

Some CPUs can switch endianness.

People who know nothing about endianness are trying to one-up each other's ignorance.

The only time I can imagine a normal programmer needing to worry about endianness is when he's writing code to communicate unserialized data between devices with different architectures (e.g. networking code).
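
A minimal sketch of that case in C, using the classic byte-order helpers (htonl/ntohl are POSIX, so this assumes a Unix-like system for arpa/inet.h):

#include <stdint.h>
#include <stdio.h>
#include <arpa/inet.h>  /* htonl / ntohl */

int main(void)
{
    uint32_t host_value = 0xDEADBEEF;

    /* Convert to big-endian network byte order before sending... */
    uint32_t wire = htonl(host_value);

    /* ...and back to host byte order on the receiving side. */
    uint32_t received = ntohl(wire);

    printf("round-trip ok: %s\n", received == host_value ? "yes" : "no");
    return 0;
}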

No one codes by hand anymore, OP. That's why you get simple pages and programs that take forever to load. Everything is a GUI and we're stupid because of it.