Now, add to the mix the fact that the manager probably has no good idea how long the task "ought" to take. No one has done it before. It's new work. It's research. There is no "routine." Yes, if you've seen it done the same way 20 times before and the average was about a month, then you know to say "have it for me in a month." But such metrics are hard to come by in the technical realm.
Curiously, my workplace has a clever way to obtain them. The developers say roughly how big a task is, within an order of magnitude, and that quantity is not strictly time-based. Management can then look at past history of such (totally arbitrary) quantities and make remarkably accurate predictions about speed, even to the point of predicting how much "unforeseen" work will need to be done. In the end, no one specifies a deadline; instead, product management specifies how much can be done in how much time and asks the vested business interests (marketing, customer relations, etc.) to prioritize based on the relative costs. That result gets returned to development, and we just make stuff happen based on that feedback. Because of the product management intermediary, everyone feels like they have a good feel for how long various tasks/products will take.
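The mechanics of this kind of forecasting can be sketched in a few lines. This is a hypothetical illustration, not my workplace's actual tooling: I'm assuming the arbitrary size units are integer "points" and that history is recorded per iteration as planned, completed, and unforeseen points.

```python
def forecast_iterations(history, backlog_points):
    """history: list of (planned, completed, unforeseen) point totals,
    one tuple per past iteration (hypothetical record format).
    Returns (iterations needed, expected unforeseen points)."""
    completed = [c for _, c, _ in history]
    unforeseen = [u for _, _, u in history]
    velocity = sum(completed) / len(history)   # avg points finished per iteration
    drag = sum(unforeseen) / sum(completed)    # unforeseen work per completed point
    total = backlog_points * (1 + drag)        # pad backlog by the observed drag
    iterations = -(-total // velocity)         # ceiling division
    return iterations, backlog_points * drag

# Product management can then tell the business side: "40 points of work
# will take about this many iterations," and let them prioritize.
iters, extra = forecast_iterations(
    [(20, 18, 4), (22, 20, 5), (19, 17, 3)], backlog_points=40)
```

The point of the arbitrary units is that the conversion to time happens statistically, from history, rather than being guessed per-task; even the "unforeseen" work shows up as a measurable ratio rather than a surprise.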
However, my workplace aside, the usual tactic is to ask the engineer how long it will take to do something, and the engineer will usually come back with some (time-padded) estimate. The manager then pads it a bit more (because engineers never pad enough), and that guess is roughly how long it takes. This method tends not to work with micromanaging managers, however: they'll come in halfway through, change their minds on half a dozen things, and then wonder why the product wasn't done on time, not realizing that they reset several processes back to square one.