I'll break your question down into two parts:
1. Can events streamed via Kafka be indexed in Elasticsearch?
Yes, if you consider Confluent's Kafka Connect as part of the Kafka ecosystem. It's not Kafka itself that does the indexing, but a Kafka Connect sink connector configured to consume from your Kafka topics and index the events in Elasticsearch.
You can find more information here: https://docs.confluent.io/current/connect/kafka-connect-elasticsearch/index.html
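As a rough sketch, the configuration you'd submit to the Kafka Connect REST API for the Elasticsearch sink looks something like this (the connector name, topic, and Elasticsearch URL below are placeholders you'd replace with your own):

```json
{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "my-topic",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

Each event consumed from `my-topic` would then be indexed as a document in Elasticsearch, with no indexing code of your own.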
2. Can I achieve the same parsing, transformation and flow-control features of Logstash directly in Kafka?
The Kafka ecosystem features I'm aware of that can help with this are Kafka Streams (but you have to know how to develop against the Kafka Streams API) and another Confluent product called KSQL, which lets you do SQL stream processing on top of Kafka topics and is more oriented towards analytics (i.e. data filtering, transformations, aggregations, joins, windowing and sessionization).
You can find more information on KStreams here: https://kafka.apache.org/documentation/streams/
And you can find more information on KSQL here: https://docs.confluent.io/current/ksql/docs/index.html
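To give a feel for the KSQL approach, here is a sketch of a Logstash-style filter-and-transform expressed as KSQL statements (the stream, topic, and column names are invented for illustration):

```sql
-- Declare a stream over an existing topic (the schema here is assumed)
CREATE STREAM logs (host VARCHAR, level VARCHAR, message VARCHAR)
  WITH (KAFKA_TOPIC='logs-topic', VALUE_FORMAT='JSON');

-- Filter and reshape events, continuously writing results to a new topic
CREATE STREAM errors AS
  SELECT host, UCASE(level) AS level, message
  FROM logs
  WHERE level = 'ERROR';
```

The second statement runs as a persistent query, so matching events flow into the `errors` stream's backing topic as they arrive, which another connector could then sink into Elasticsearch.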
Conclusion
In my opinion you wouldn't be able to achieve all the parsing and transformation capabilities of Logstash / NiFi without programming against the Kafka Streams API, but you definitely can use Kafka Connect to get data into or out of Kafka for a wide array of technologies, just like Logstash does.