There are two parts to this:
- Parsing the link syntax.
- Generating a URL from a page title.
Parsing the link syntax is done by the Parser class, which handles all wikitext processing. Links can be more complex than just [[Foo]], e.g. [[Foo#bar|something]], where Foo is the target page, #bar is a section, and "something" is the "surface text" of the resulting link. To generate a URL, you need the target page and the section. To generate a full link (in HTML), you also need the surface text (if none is given, it defaults to the target page plus section). There is currently no easy way to parse just a link without running full wikitext processing.
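If you do need to run wikitext through the Parser, a rough sketch might look like the following. This is illustrative only: the exact setup of ParserOptions and the context title varies between MediaWiki versions, and the page title "Sandbox" is just a placeholder.

```php
// Illustrative only: parse wikitext containing a link and get the rendered HTML.
$wikitext = 'See [[Foo#bar|something]] for details.';
$contextTitle = Title::newFromText( 'Sandbox' ); // page the text is rendered "on" (placeholder)
$options = ParserOptions::newFromContext( RequestContext::getMain() );

global $wgParser;
$output = $wgParser->parse( $wikitext, $contextTitle, $options );
echo $output->getText(); // HTML containing the rendered link to Foo#bar
```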
If you already have the target page title (and maybe a section id), you have two ways to get a URL for it: the old-school (monolithic) way, or the new-style (service-based) way.
The new style method is:
- Use MediaWikiTitleCodec::parseTitle to generate a TitleValue object from the string(s) you have.
- Use MediaWikiPageLinkRenderer::getPageURL to get the page URL for a given TitleValue object.
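Put together, the new-style approach might look roughly like this. The constructor arguments and exact method signatures are illustrative and may differ between MediaWiki versions, so check the class documentation for your release.

```php
// Illustrative sketch of the service-based approach.
global $wgContLang;

// MediaWikiTitleCodec implements both TitleParser and TitleFormatter.
$titleCodec = new MediaWikiTitleCodec( $wgContLang, GenderCache::singleton() );
$linkRenderer = new MediaWikiPageLinkRenderer( $titleCodec );

// Parse "Foo#bar" into a TitleValue (target page plus fragment/section).
$target = $titleCodec->parseTitle( 'Foo#bar', NS_MAIN );

// Get a URL for the TitleValue.
$url = $linkRenderer->getPageUrl( $target );
```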
The old school method is:
- Use Title::newFromText() to create a Title object from the string(s) you have.
- Use Title::getFullURL() to get the page URL.
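The old-school equivalent, for comparison (the resulting URL shown in the comment is just an example; it depends on your wiki's configuration):

```php
// Old-school sketch using the monolithic Title class.
$title = Title::newFromText( 'Foo#bar' );
if ( $title !== null ) {
    // Full URL including the fragment, e.g. "https://example.org/wiki/Foo#bar".
    $url = $title->getFullURL();
}
```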
The old style is easier because it doesn't require you to create any service objects, but it relies on global state and has all its dependencies hardcoded. The new style uses simple dependency injection, which allows each part to be swapped out and tested independently, at the cost of a little more code.