
“No one is submitting their best work”: What nursing students told us about their clinical rotations

Stories from the Street



We didn’t start with a thesis. We started with a conversation.


A few weeks ago, we sat down with two nursing students at a local university, both in the middle of clinical rotations at area hospitals. We asked each of them to walk us through what a typical clinical day looks like.


What followed was about an hour of honest, specific storytelling. We're sharing it because we think these voices are worth hearing.


“I haven’t gotten any feedback yet. None.”


Both students are completing 12-hour clinical shifts on med-surg floors. When we asked whether either of them had received documented feedback from their current rotation, the answer was the same:


No.


One student described her program’s Clinical Evaluation Tool, built in Google Sheets. Students fill out their portion within 24 hours after every shift. Instructors are then supposed to respond with their own assessment.


She laughed a little when she explained what usually happens.


“My first semester, our instructor got called out for not responding and then went back and filled everything out after the semester was already over.”


The other student described the same dynamic:


“Most instructors don’t fill it out until the end of the semester. So by the time you get anything, it doesn’t really help.”


Both students said feedback, when it does arrive, tends to focus on how they write care plans, not on their clinical skills or patient interactions. One noted she’s never received documented feedback on a hands-on skill, even after being observed performing one.


“Everyone knows the system isn’t good”


When we asked whether the issue was the instructor or the evaluation system itself, one student paused, then said something that stuck with us:


“I think everyone knows the system they’re using isn’t good, so no one really wants to engage with it.”


“No one is submitting their best work.”


The other student echoed this. Her program’s evaluation tool feels like extra work for both sides: students filling out forms that don’t get read, instructors expected to recall details from shifts that happened weeks ago. Neither side finds it useful, and neither side is pretending otherwise.


What both students said they'd want is simpler: quick, structured feedback tied to specific moments, like a checklist after a skill or a short note from an instructor at the end of a shift. Something timely enough to be actionable.


“It just needs to be easy for them too,” one student said about her instructors. “If it’s too much work, they won’t do it. And I get that.”


“You either get lucky, or you don’t”


When we asked about hands-on skill exposure, both students described the same dynamic independently.


For major clinical skills (IV insertions, wound care, catheter insertions), whether a student gets to practice during a rotation comes down largely to timing and circumstance. Which patients are admitted that day. Which nurse they’re paired with.


“One student might do three catheter insertions in a single day. Someone else might not get the chance at all that semester.”


One student’s program does have a skills tracker (another tab in the same Google Sheets document) where an instructor initials when a student completes a skill. But in practice, there’s no mechanism behind it. No way to flag who still needs an opportunity. No easy way for an instructor to remember, a week later, who should be next.


With clinicals spaced about a week apart this semester, the gaps compound. By the time an instructor might notice an uneven distribution of skill opportunities, the window for a particular patient presentation may have already closed.


“So many different logins”


Between the school’s Google Site for placements, the Google Sheets evaluation tool, a separate credentialing platform, hospital orientation modules, and the hospital’s patient records system, students are managing a fragmented web of tools that don’t talk to each other. Feedback, skills tracking, and credentials each live somewhere different.


One student said she wished there were just one place where everything lived, for her sake and for her instructors'.


What the data gap in clinical rotations costs


What these two students described isn’t a motivation problem. It isn’t a staffing problem. It’s a documentation and visibility problem. And it has real consequences.


When feedback disappears, instructors can’t identify who needs support before it’s too late. Programs can’t demonstrate outcomes to accreditors or health system partners. And students graduate without a documented record of what they’ve learned to do.


The students we spoke with understood this intuitively. They weren’t asking for more. They were asking for better structure around what already happens on the floor every day. Quick notes. Timely prompts. Something that makes capturing a clinical moment as easy as the moment itself.


That’s the gap clinical education technology should be closing.


What closing that gap looks like


The Whitecoat Learning Platform was built around the exact friction points these students described.


Preceptors can document feedback directly on a mobile device, in under a minute, tied to a specific clinical moment rather than reconstructed at the end of the semester. Skills tracking is built in, with clear visibility into who has completed a given skill and who still needs the opportunity. That means distribution gets managed intentionally, not left to whichever patients happened to be admitted that day.


At-risk learners are flagged in real time, before a semester closes and the window for intervention passes. And because rotations, competency assessments, credentials, and feedback all live in a single platform, students and instructors aren’t managing five separate logins to piece together a picture that should already be whole.


For programs, the result is a continuous, documented record of how each learner is developing across every site and preceptor. Not a semester-end summary. A real-time picture of readiness, built incrementally across the full rotation experience, that programs can act on and that health system partners can trust.


Why we’re sharing this


These two students are working hard. They show up for 12-hour shifts, submit their paperwork on time, and are genuinely trying to learn. They talked openly with us because they wanted their experience to be useful to someone.

Stories from the Street is an ongoing series where we sit down with students, clinical instructors, and program leaders to hear what clinical education looks like from the inside. If you’re a nursing student, instructor, or program director with a story of your own, we’d love to hear it.


About Whitecoat

Whitecoat is an all-in-one clinical education platform that helps programs manage rotations, assessments, and competency tracking in one place. It provides learners with real-time feedback, gives educators a simple way to document performance, and surfaces at-risk signals and readiness data that programs can act on, and that health system partners can trust. The result is a clearer path from clinical training to workforce-ready hire.
