Hollywood's diversity isn't real "diversity". Are the movies still set in the US? Do they still feature American protagonists saving the world (black, white, what-the-****-ever, are they still American citizens)? Do they still portray the US as the center of the world? Do they still tackle every issue through an American lens? Do they still ignore the rest of the world?
The answer to all these questions is a resounding yes. As long as that keeps happening, making Batman black isn't championing diversity. It's the same old crap riding the social media push for "diversity". If they actually cared, they'd take their white characters, relocate them across the world, stop making the US the center of the world, etc, etc. But they won't do that, so we're left with them taking white characters, turning them black, and saying "woo boy, look how progressive we are", all while ignoring the actual black characters.
TL;DR: Hollywood/comic diversity is nothing more than a fake push. It's still US-centric and doesn't change jack-**** besides the character's color.