This is exactly what we are doing in our company, thank you! We are trying to formalize it since no process existed, and it bears a striking resemblance to the outline here. Having been in the industry for almost 30 years now, I have strong opinions on this topic from lessons learned.

One lesson: define the number of reviews the SME gets and give guidance on what they should focus on during each one; otherwise they take every opportunity to make substantive changes up to the very last second. For example, early reviews can include lots of wordsmithing and content changes, but the last review should keep those to a minimum so you can focus on making the course work as intended. I make it especially clear when the technical reviewer should focus on "is the text accurate" rather than changing design elements that might be part of a corporate template, the tool design, or learning best practices. I also make sure they understand the limitations of the tool we are using, so they realize when they can't add green boxes or curly Qs. Right now this communication lives in an email, but I am adding it to an SLA.

Another lesson learned: I hold a meeting to show them the options for the solution (if they are given options) and what the tool can do, so they understand any restrictions up front. It also helps demonstrate the complexity of the software, all the settings that need to be perfect, and why their last-minute change can't be done in 30 seconds.