Sounds like the process is a little out of order. If information is pushing the design, it's really difficult to build a solid product that does anything but carry information in buckets. And that's not a great situation to be in. Outcome expectations should drive solution decisions. The biggest question: What problem(s) are we solving today? If the answer is "I dunno," it's faster, cheaper, and easier on your learners to feed the money straight into the paper shredder.
Here's a short presentation that shows our revised high-level process (the conceptual sequencing appears on slide 38). We do have a set of worksheets to capture artifacts and outputs at various stages. These are really simple (even if the process of discovering the artifacts that go into them is not). Worksheets are attached.
Fortunately, with some minor process changes, I think you can still give the writer the impression that they're contributing energy early and avoid over-the-fence syndrome. Start with performance and business requirements: a short paragraph describing what the organization wants to accomplish or realize, plus a short list of tasks the participant needs to perform. Follow that with an examination or breakdown of the covert tasks (this is where the power of digital solutions lives), and then look at practice opportunities for the tasks identified. That way you're really focusing on what people DO, not what they need to know -- yet. Content excavation doesn't start until we get near the end of the cycle shown in the graphic referenced above (we're just starting to look at content / information when we're outlining objectives and assessment items).
I've attached the worksheets we use. Again, really simple. But they add formality to the process and help formulate outcomes before we start talking about outputs. Outputs without consideration of outcomes... that's what sucks the success out of a product. The GOTS/COTS review is done up front as a baseline market search. It's not intended to stop there.
We're working on a few additional worksheets to further tune the process. One thing we often see is a lack of learning-problem definition. In other words: what's hard about learning this concept? Breaking that down into a particular classification of learning problem can be really helpful when matching media and methods. Too often, media and delivery-method selections are completely arbitrary. Basing these decisions on "gut" or "this is the way other people do it" leaves too much to chance, in my opinion. Science can make the arbitrary leaps much smaller, reducing the risk that a pile of steamy poo will be the only tangible outcome.
Examples of things that would go into these worksheets can be found in our SOP (see Appendix E).