I love how I got the Deus Ex banner for this thread. It's my favorite video game.
All the things you list have the potential to be good, but in our current society I expect they will lead to nightmares. Part of the issue at hand is that our society treats its people like tools (specifically, too many people in power treat others as tools for acquiring money, power, or more generally for furthering their capacity for domination); if left uncontrolled, transhumanism may literally turn us into such tools. Also, those who outlive their usefulness may be left behind.
Genetic engineering is an issue in a competitive landscape because it may influence much more than just intelligence or appearance; it may also be used to influence compliance and political/philosophical tendencies. The outcome will depend on what sort of selective pressures exist; in our modern world they would probably be intelligence, beauty, complacency, compliance, and conformity. "Well-adjustedness," in a sense. Also, it may not be possible for people who don't want to participate to reasonably escape; being free to opt out, get shitstomped by everyone else, and die leaving no children doesn't count.
Mechanical augmentation suffers from the same problem.
I would like to know how you think ambition and self-interest are going to be removed from government. Even if they are, the ruling caste will have to follow some sort of guidelines for managing humanity. Ideally they will help their subjects live fulfilling lives, but that is hardly a guarantee. Things like naive utilitarianism could be particularly scary.
Humanity won't have much of a reason to pursue science, as the machines will be much better at it. They'll be better at art too, but people will probably find some meaning in it anyway.
Why do you want to live in this society? Even at best it seems like sticking around after you've won a video game, i.e., fucking boring.
I'm not quite sure what the last thing you mention refers to.
Anyway, my concern is that too many people wouldn't want this and especially wouldn't want to be forced to be a part of it. How do you expect to deal with them?
Personally, I believe one of the better courses of action is to have a benevolent, effectively omnipotent overseeing intelligent construct which maintains certain negative freedoms for nonascended humanity (the construct may originate either from strong AI or from a partial replication of a particular human personality). If desired, people may attach their conscious experience to the construct, in effect operating as a weakly-centralized hivemind and gaining the benefit of the collected knowledge of humanity. In the long run, humanity may consent to coalesce into a single consciousness and explore the stars/do galactic-scale science/chill out and wait for the invasion by other constructs from neighboring galaxies. This is similar in concept to the short story "The Last Question" by Isaac Asimov.
Keep in mind that if intelligent constructs are run at full speed, external time will pass very slowly from their perspective (1 second of real time = millennia of subjective thought), which would likely drive you batshit insane in very short order.
I know some of the people on the rationalist/positivist blogs like to discuss the problems of transhumanism in much greater depth than people here. One of them is Slate Star Codex; if you're interested, check out its archives.