It's time to show off my Java hello world with 7 errors on line 34
Programmer Humor
Post funny things about programming here! (Or just rant about your favourite programming language.)
I don't know what I did wrong, but the bug must be somewhere in HelloWorldExampleClassForTutorialBuilderFactory.HelloWorldExampleClassForTutorialBuilderFactory(StringBuilderFactory myHelloWorldExampleClassForTutorialStringBuilder, int numberOfTimesToDisplayHelloWorld)
I know the guy meant it as a joke, but on my team I see the damage "academic" OOP/UML courses do to a programmer. In a C++ library that's supposed to be high-performance code, doing things like solving certain PDEs and running heavy Monte Carlo simulations, the people with an OOP/UML background tend to abuse dynamic polymorphism (they put on a Pikachu face when you show them that static polymorphism also exists) and write a lot of bad code with lots of indirection. Many of them aren't aware that virtual functions and dynamic_casts have a price, and an especially ugly one if you use them at every step of your iterative algorithm. They're usually used to garbage-collected languages, and when they switch to C++ they become paranoid and abuse shared_ptr because it gives them peace of mind: the resource is guaranteed to be freed when it's no longer needed and they don't have to care about when that is. They conveniently ignore that under the hood there are atomic operations every time the ref counter is incremented (I removed the shared pointers of a dev who did this on our team and our code became twice as fast). Like the guy in the screenshot, I certainly wouldn't want someone on my team who was molded by Java and UML diagrams.
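A minimal C++ sketch of the distinction being drawn above, with made-up names (Model, run_dynamic, run_static) purely for illustration: the first version pays for a virtual call per element plus an atomic ref-count bump every time the shared_ptr is copied, while the template version borrows the object and lets the compiler resolve and inline the call.

```cpp
#include <memory>
#include <vector>

// Dynamic polymorphism: every step goes through the vtable and can't be
// inlined, and copying the shared_ptr bumps an atomic reference count.
struct Model {
    virtual ~Model() = default;
    virtual double step(double x) const = 0;
};

double run_dynamic(std::shared_ptr<const Model> m,   // atomic ref-count bump on copy
                   const std::vector<double>& xs) {
    double acc = 0.0;
    for (double x : xs) acc += m->step(x);            // virtual call per element
    return acc;
}

// Static polymorphism: the concrete type is a template parameter, so the call
// can be resolved and inlined at compile time, and the object is borrowed by
// const reference with no reference counting at all.
template <class ConcreteModel>
double run_static(const ConcreteModel& m, const std::vector<double>& xs) {
    double acc = 0.0;
    for (double x : xs) acc += m.step(x);             // resolved at compile time
    return acc;
}
```

Whether the difference actually matters in a given spot is, of course, what profiling is for.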
Depends on the requirements. Writing the code in a natural and readable way should be number one.
Then you benchmark and find out what actually takes time; and then optimize from there.
At least that's my approach when working with mostly functional languages. No need to obsess over the performance of something that's run only a dozen times per second.
I do hate over-engineered abstractions though. But not for performance reasons.
You need to be careful about benchmarking to find performance problems after the fact. You can get stuck in a local maximum where there is no particular cost center but it's all just slow.
If performance specifically is a goal, there should probably at least be a theory of how it will be achieved, and then that can be refined with benchmarks and profiling.
Writing the code in a natural and readable way should be number one.
I mean, even there it depends on what you're doing. A small matrix multiplication library should be fast even if it makes the code uglier. For most coders you're right, though.
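A concrete sketch of the kind of "uglier but faster" trade-off meant here, assuming square row-major matrices (the function name is made up): swapping the textbook i-j-k loop order for i-k-j keeps the innermost loop streaming through memory, which is usually several times faster for large n while reading less like the mathematical definition.

```cpp
#include <cstddef>
#include <vector>

// n x n matrices in row-major order; c must be pre-zeroed.
// The textbook i-j-k order walks b column-wise (cache-unfriendly);
// the i-k-j order below walks b row-wise instead.
void matmul_ikj(const std::vector<double>& a,
                const std::vector<double>& b,
                std::vector<double>& c,
                std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t k = 0; k < n; ++k) {
            const double aik = a[i * n + k];
            for (std::size_t j = 0; j < n; ++j)
                c[i * n + j] += aik * b[k * n + j];
        }
}
```

Production linear-algebra libraries go much further than this (blocking, SIMD, threading), which is also the argument made a few comments down for not reinventing the wheel.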
Even then you can take some effort to make it easier to parse for humans.
Oh, absolutely. It's just the second most important thing.
You can add tons of explanatory comments with zero performance cost.
Also, in programming in general (so, outside stuff like being a quant) the fraction of code written where high performance is the top priority is minuscule (and I say this having actually designed high-performance software systems for a living). As explained earlier by @ForegoneConclusion, you don't optimize upfront, you optimize when you figure out it's actually needed.
Thinking about it, if you're designing your own small matrix multiplication library (i.e. reinventing the wheel) you're probably failing at the software design level: as long as the licensing is compatible, it's usually better to pick something that already exists, is performance-oriented and has been in use for decades than to make your own (almost certainly inferior and freshly buggy) thing.
PS: Not a personal criticism - I too still have to remind myself at times not to reinvent what's already out there. It's only natural for programmers to trust their own skills above those of whichever random people wrote some library, and to want to program rather than spend time evaluating what's out there.
In my experience we all go through a stage, at the Designer-Developer level, of having discovered things like Design Patterns and then overengineering the design of the software to the point of making it nearly unmaintainable (for others, or for ourselves 6 months down the line).
The next stage is to discover the joys of KISS and, like you described, to refrain from premature optimization.
I think many academic courses are stuck with old OOP theories from the 90s, while the rest of the industry learned from their failures a long time ago and moved on to more refined OOP practices. Turns out inheritance is one of the worst ways to achieve OOP.
I think a lot of academic OOP adds inheritance for the heck of it. Like they're more interested in creating a tree of life for programming than they are in creating a maintainable, understandable program.
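A rough sketch of the alternative that complaint usually points at (composition over inheritance), with invented types: instead of threading behavior through a BaseService -> LoggingService -> RetryingService chain, the class just holds the pieces it needs.

```cpp
#include <iostream>
#include <string>
#include <utility>

// Small, reusable parts instead of layers in a class hierarchy.
struct Logger {
    void log(const std::string& msg) const { std::cout << msg << '\n'; }
};

struct RetryPolicy {
    int max_attempts = 3;
};

// Composition: PaymentClient owns a Logger and a RetryPolicy rather than
// inheriting from a chain of base classes that each add one behavior.
class PaymentClient {
public:
    PaymentClient(Logger logger, RetryPolicy retries)
        : logger_(std::move(logger)), retries_(retries) {}

    void charge(double amount) {
        for (int attempt = 1; attempt <= retries_.max_attempts; ++attempt) {
            logger_.log("charging " + std::to_string(amount) +
                        " (attempt " + std::to_string(attempt) + ")");
            break;  // pretend the first attempt succeeded
        }
    }

private:
    Logger logger_;
    RetryPolicy retries_;
};
```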
That’s the problem, a lot of CS professors never worked in the industry or did anything outside academia so they never learned those lessons…or the last time they did work was back in the 90s lol.
Doesn’t help that most universities don’t seem to offer “software engineering” degrees and so everyone takes “computer science” even if they don’t want to be a computer scientist.
I fully agree about the damage done at universities. I also fully agree about the teaching professors being out of the game too long, or never having been at a level worth teaching to other people. A term I first heard from William Kennedy is 'mechanical sympathy'. IMHO this is the big missing thing in modern CS education. (Ok, add to that the missing parts about proper OOP, proper functional programming and literally anything taught to CS grads other than relational/automata theory and mathematics (summary: mathematics) :-P). In the end I wouldn't trust anyone who cannot write Assembler or C and doesn't know about Compiler Construction to write useful low-level code or even tackle C++/Rust.
OOP/UML courses
Luckily, I had only one, and the crack coder who code-golfs in assembler did the work for the three of us.
This thread reminds me that most “developers” are terrible and don’t take the time to understand the language.
All of these Java developers you guys hate are the result of schools pushing out idiots. It's not the language but rather the type of people you hire. These people will suck at writing in any language, regardless of what order they try them in.
Agreed, good tools can be used badly. Over the years I've written Java, C++, and PHP professionally, and I've seen excellent and horrible impls in each. Today, I mostly use Java and this thread is reminding me that I need to learn a new for-fun language.
When I was in the military, the shooting instructors said they preferred training females because they haven't been trained poorly by somebody else.
EDIT: Designating recruits as male and female is the way the military does things. I don't use the terms male and female when referring to groups of humans. I felt the need to clarify since somebody already took offense.
OOP does things to a person
a PersonImpl, you mean? :P
I, too, would like the winter winds to teach me about Rust.
Man, if I were in the US I'd apply for that job in a heartbeat. It looks like it was written by a head dev who actually knows what he's talking about rather than some recruiter.
That's really interesting. Maybe it's like @nxtsuda@lemmy.world said. For a lot of folks, OOP was the way we learned and operated for years.
Could they have just asked it differently? Or do they just have Java hate.
It's obviously an embedded role. Java and its developers are notorious for throwing memory and compute usage out the window.
Ool about it. Where does the Java hate come from?
OOP is fine. It's particularly Java culture that's terrible.
I never want to see the word Factory in a class name ever again.
When a Java dev writes in any other language, you can tell. Too many layers of abstraction is a key indicator. They make simple problems complex.
I once inherited a C# website project from a Java dev. I couldn't even figure out how to modify the CSS. And I'm a C# dev.
Factories can be good in moderation. If you make factories for every class, maybe you need to rethink your practices.
lol, last time I switched jobs some years ago I did the same but from the other side: I had a small side section listing my level of expertise in programming languages and explicitly added Java at 1/10 to send a clear message xD
(It's not that radical given that I've been an embedded/graphics programmer for most of my career, but still, funnier than not mentioning it)
I've seen horrible messes made in all of the languages listed above, it doesn't matter anymore
I actually have a ton of professional Java experience and have done a lot of microcontroller stuff of late (mainly for fun), and if you go about writing software for ARM Cortex-M microcontrollers the Java way, you're going to end up with overengineered bloatware.
It's not a Java-only thing, however: most of those parts have too little memory and too few processing resources to design the whole program in a pure OO way. Plus, you're pretty much coding directly at the low level (with at most a thin Hardware Abstraction Layer between your code and direct register manipulation), so only ever having used high-level OO languages isn't really good preparation for it. That applies not only to people with only Java experience, but also to those whose entire experience is with things like C#/.NET, as well as all the smartphone frameworks and languages (Objective-C, Kotlin, Swift).
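For anyone who hasn't worked at that level, "direct register manipulation behind a thin HAL" tends to look something like this C++ sketch; the register layout and base address are invented for the example, the real ones come from the vendor's reference manual or CMSIS headers.

```cpp
#include <cstdint>

// Hypothetical memory-mapped GPIO block; address and layout are invented for
// this example, not taken from a real part's datasheet.
struct GpioRegs {
    volatile std::uint32_t MODER;  // pin mode register
    volatile std::uint32_t ODR;    // output data register
};

constexpr std::uintptr_t GPIOA_BASE = 0x40020000u;
inline GpioRegs* const GPIOA = reinterpret_cast<GpioRegs*>(GPIOA_BASE);

// "Thin HAL": an inline function over a raw register write.
// No heap, no virtual dispatch, nothing constructed at runtime.
inline void set_pin(GpioRegs* port, unsigned pin, bool high) {
    if (high) port->ODR |=  (1u << pin);
    else      port->ODR &= ~(1u << pin);
}

// Usage on the target: set_pin(GPIOA, 5, true);  // drive pin 5 high
```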
I used to write a lot of performance-critical Java (oxymoron, I know) for wearables, and one time I got a code reviewer who only did server-side Java; the differences in our philosophies were staggering.
He wanted me to convert all my code to a functional style, using Optionals and streams instead of simple null checks and array iterations. When I explained that those things are slower and take more memory, it was like I was speaking an alien language. He had never even had to consider that his code would be running on a system with limited RAM and CPU cycles, and didn't even understand how that was possible.