My intuition is that 1) self-awareness (modeled through epistemic modal logics), 2) goal reasoning (e.g. https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/view/12292), and 3) consciousness (e.g. Integrated Information Theory) are jointly necessary and sufficient for an agent to be truly autonomous. Consciousness in particular seems to be the key feature of autonomy. So: can a truly autonomous agent (one with human-level autonomy) exist without consciousness?
Of course, I am thinking about autonomy, not unlimited freedom. E.g. every agent can be autonomous yet still be bounded by 1) scarcity of resources and 2) the ultimate goals and values inscribed into the agent by humans.