That undergraduate-level education is definitely a couple of levels above the high-school education most people receive, and difficult enough that most people couldn't manage it, but it's still roughly mass-produced kindergarten compared to what the people actually working in those fields, discovering genuinely new things, are doing.
Part of it is how far back in history a piece of knowledge was discovered. High school - math and science in particular - teaches material that has been known for centuries, if not millennia. Undergraduate classes might bring you up to only one or two centuries ago. Postgraduate work may let you dip into things that were (sometimes) discovered in living memory. It's really only at the doctorate and postdoc level that you get a chance to work with things from the present generation.
Other areas, like media/culture (and sometimes business), are far more likely to reference things from the past few years. Even then, there's often a huge swathe of material from previous centuries to learn as a foundation for most of it.
Also: formal education, with its nigh-unyielding curriculum schedules, may not be the best learning environment for people whose capacities vary significantly due to medical, financial, or circumstantial issues. With so much learning shifting to digital sources and interactive tools, I wonder whether at least low-level testing could be automated enough to take place more often, letting students self-pace more easily. It's not just autistic people who have good and bad days, or shifting areas of focus - it's people with all kinds of medical conditions, varying levels of pain, and obligations like employment or family duties to work around. It'd be nice to be able to self-pace the entire educational track, kindergarten through postgrad, over the internet, mixing remote learning with in-person options on an as-needed basis, and taking tests/exams on a more personal schedule.