Hey everyone — I’m a final-year student, and I’ve been wondering about this a lot lately. We always hear that “you need a good project to land a job”, but most students I know either copy from GitHub, get stuck, or just… give up. We’re doing a small open survey to understand this from both sides — students and educators. If you’ve ever:
- Built or struggled with a final-year project
- Helped someone else do it (educator/mentor)
- Wanted to sell or learn from real-world projects
We’d love to hear your honest experience. 🙏 It’s just 2–3 mins, totally anonymous.
📄 Survey Link – for students & educators
We’ll be using the insights to create open resources and maybe a system that actually helps. Thanks in advance if you participate — or drop a comment about your experience.
Rather than writing something from scratch, it would make more sense to maintain an existing application stewarded by the university. Some universities do work this way, but even then they don’t have enough projects for everyone to work on.
Idk which market you are in. In mine, no one cares about your project. Most companies don’t do innovation, and most real-life projects I’ve seen just move data around. When I interview people, all I care about is knowledge of the tech stack and what I call “analytical thinking”.
Students generally don’t have real-world problems that need solving. I think pretending they do makes a lot of assumptions about their life, hobbies, free time…
It’s much much much more important to have a co-op program. Everything practical I learned in university was through my co-op work terms.
I am responsible for hiring some devs right now, and there’s been a wide spectrum of competence from people who have “real projects”. Especially with how prevalent AI is, people can literally just talk to an AI agent and get some kind of app/website spun up with 0 skill and effort. What I am always looking for is people who can work on a team in an existing codebase.
Time.
It takes quite a lot of time to craft anything even remotely useful; the golden era of extreme Java programming has long passed.
It’s the real world part.
I have written an SMTP server in C and some CRUD services in Java, but everything ran on localhost.
If I wanted to expose them on the web, I would have to ensure the code is actually secure, ensure compliance with data protection regulations, moderate user-generated content, and get SSL certificates, a domain, and server hosting.
Half those steps are one minor mistake away from a large bill.

Yeah, it’s easy to underestimate how big a leap it is from a toy application to real-world usability. Not just in terms of security, but also:
- useful error messages
- logging / monitoring
- configuration
- building a distribution
- deploying in a reproducible way
- documentation
- integration with existing infrastructure
- data migration strategies
- etc.
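To make the “useful error messages” and “configuration” points above concrete, here’s a small sketch in plain Java. The `loadPort` helper and the `port` config key are made up for illustration; the contrast is between the bare crash a toy app gives you and the actionable message a deployed one needs:

```java
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ConfigLoader {
    private static final Logger LOG = Logger.getLogger(ConfigLoader.class.getName());

    // Toy version: blows up with a bare NullPointerException or
    // NumberFormatException, giving the operator no hint what went wrong.
    static int toyPort(Map<String, String> cfg) {
        return Integer.parseInt(cfg.get("port"));
    }

    // Real-world version: validates input and fails with a message
    // that tells the operator exactly what to fix.
    static int loadPort(Map<String, String> cfg) {
        String raw = cfg.get("port");
        if (raw == null) {
            throw new IllegalArgumentException(
                "Missing config key 'port'; set it in your config file or environment");
        }
        try {
            int port = Integer.parseInt(raw.trim());
            if (port < 1 || port > 65535) {
                throw new IllegalArgumentException(
                    "Config key 'port' must be in 1..65535, got: " + port);
            }
            LOG.log(Level.INFO, "Using port {0}", port);
            return port;
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException(
                "Config key 'port' must be an integer, got: '" + raw + "'", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(loadPort(Map.of("port", "8080")));
    }
}
```

Multiply this pattern across every input a real deployment touches and the extra effort over a localhost toy becomes obvious.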
This adds a lot of complexity, so you’ll also need to learn additional tools and practices to be able to deal with it at all:
- modularization
- version control systems
- software specifications (via unit/integration tests)
- team communication
- helper tooling, like package managers, linters etc.
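The “software specifications (via unit/integration tests)” point can be sketched in plain Java (no test framework; `slugify` is a hypothetical helper, not anything from the thread): the checks double as an executable spec a teammate can read and run.

```java
// Sketch of "tests as specification": each check below documents one rule
// that slugify() must satisfy, in an executable form.
public class SlugSpec {
    // Hypothetical helper under test: turns a title into a URL slug.
    static String slugify(String title) {
        return title.trim()
                    .toLowerCase()
                    .replaceAll("[^a-z0-9]+", "-")   // collapse runs of non-alphanumerics
                    .replaceAll("(^-|-$)", "");      // strip leading/trailing dashes
    }

    static void check(boolean ok, String rule) {
        if (!ok) throw new AssertionError("spec violated: " + rule);
    }

    public static void main(String[] args) {
        check(slugify("Hello, World!").equals("hello-world"),
              "punctuation becomes a single dash");
        check(slugify("  Final-Year Project 2024 ").equals("final-year-project-2024"),
              "whitespace is trimmed, digits are kept");
        check(slugify("---").equals(""),
              "pure punctuation yields an empty slug");
        System.out.println("all specs pass");
    }
}
```

The value isn’t the helper itself; it’s that a new teammate can learn the intended behavior from the checks without asking anyone.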
Learning about all this stuff takes years, especially if no one in your surroundings has much experience with any of it either. Professors don’t have the time to gain or retain this experience, since they already have a different full-time job.
My advice would be to get students to do internships or to take a job as a working student in a company/organization. Sometimes, these can be shitty for the students, but they can often provide significantly more real-world context than college ever could.
It’s a Google Docs link…