One of the simplest semantic representations of text is to transform it into propositions. The sentence:
If you get stressed or you don't eat well, then you get ill.
could be represented as:
get_stressed ∨ ¬eat_well → get_ill
in logic.
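To make this concrete, here is a minimal sketch of how such a proposition can be manipulated programmatically. It uses sympy's logic module (an assumption on my part; any propositional-logic library would do), with the symbol names taken from the formula above:

```python
# A minimal sketch: encode the rule as a propositional formula and query it.
from sympy import symbols, Implies, Or, Not
from sympy.logic.inference import satisfiable

get_stressed, eat_well, get_ill = symbols("get_stressed eat_well get_ill")

# get_stressed ∨ ¬eat_well → get_ill
rule = Implies(Or(get_stressed, Not(eat_well)), get_ill)

# Query: given the rule, can someone be stressed and yet not ill?
print(satisfiable(rule & get_stressed & Not(get_ill)))  # False (unsatisfiable)
```

Once the text is in this form, standard satisfiability or entailment checks become available for free.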
You may also find it useful to look at Semantic Role Labeling (SRL), a sub-field of NLP with many rich resources (PropBank, VerbNet, FrameNet, ...). By representing verbs together with their roles (agents, direct and indirect objects, prepositional arguments, ...), sentences can be represented as graphs that capture their semantics. Jurafsky covers SRL in chapter 18 of his book: Chapter 18 - Semantic Role Labeling and Argument Structure
As in Jurafsky's example, a verb's arguments can be extracted from a sentence:
John[AGENT] broke the window[THEME] with a rock[INSTRUMENT].
and simply represented as: break(John, the window, a rock)
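If you want to see what such a predicate-argument structure might look like in code, here is a small library-free sketch (the class and role names are illustrative, not the output format of any particular SRL system):

```python
# A minimal sketch of a predicate-argument frame for Jurafsky's example.
from dataclasses import dataclass

@dataclass
class Frame:
    predicate: str
    roles: dict[str, str]  # role label -> text span filling that role

frame = Frame(
    predicate="break",
    roles={
        "AGENT": "John",
        "THEME": "the window",
        "INSTRUMENT": "a rock",
    },
)

# Render it in the compact notation used above: break(John, the window, a rock)
print(f"{frame.predicate}({', '.join(frame.roles.values())})")
```

In practice an SRL tool (e.g. one trained on PropBank-style annotations) would produce these labels for you; the structure above is just the target representation.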
Complete Book link
Hope this helps,
Cheers