To answer the question of whether complex robotic systems should be given rights, we first need to clarify what we consider the requirement for an individual to be granted rights. The first possibility is the one referenced above: that anything able to feel pain, or something very similar to pain, should be granted rights because of its capacity to suffer. This argument is often made by vegans and vegetarians, but also by some philosophers such as Hans Jonas. If we approach the discussion this way, even the most complex AI would not need rights, because it is not able to feel pain in any way.

The second possibility says that anything that has thoughts, can perceive itself as an individual, has dreams, and so on, needs rights. If we choose this approach, the decision becomes much harder, because we would have to draw a line between non-self-aware AIs and self-aware AIs.