More Universities Need to Teach Sales
Harvard Business Review
APRIL 26, 2016
Sales was traditionally seen as a form of service work, with an emphasis primarily on developing moral character. So a school could legitimately prepare a student for a business career while omitting training in sales. In other words, why serve hamburger when you can teach people to cook steak?