From 1fb325eed87628774c1735083fc103a0288df14e Mon Sep 17 00:00:00 2001 From: whyoleg Date: Fri, 2 Aug 2024 15:48:39 +0000 Subject: [PATCH] Publish 2.0.20-SNAPSHOT documentation --- .../data_model/documentable_model/index.html | 3 +- .../core_extension_points/index.html | 2 +- 2.0.20-SNAPSHOT/search/search_index.json | 2 +- 2.0.20-SNAPSHOT/sitemap.xml | 28 +++++++++--------- 2.0.20-SNAPSHOT/sitemap.xml.gz | Bin 392 -> 390 bytes 5 files changed, 18 insertions(+), 17 deletions(-) diff --git a/2.0.20-SNAPSHOT/developer_guide/architecture/data_model/documentable_model/index.html b/2.0.20-SNAPSHOT/developer_guide/architecture/data_model/documentable_model/index.html index 44bd201554..a61b4a678d 100644 --- a/2.0.20-SNAPSHOT/developer_guide/architecture/data_model/documentable_model/index.html +++ b/2.0.20-SNAPSHOT/developer_guide/architecture/data_model/documentable_model/index.html @@ -1458,7 +1458,8 @@

diff --git a/2.0.20-SNAPSHOT/search/search_index.json b/2.0.20-SNAPSHOT/search/search_index.json index 559456aeae..4c93cc62b2 100644 --- a/2.0.20-SNAPSHOT/search/search_index.json +++ b/2.0.20-SNAPSHOT/search/search_index.json @@ -1 +1 @@ -{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Dokka","text":"

Dokka is an API documentation engine for Kotlin.

If you want to learn how to use Dokka, see documentation on kotlinlang.org.

If you want to learn more about Dokka's internals and/or how to write Dokka plugins, see Developer guides.

"},{"location":"developer_guide/introduction/","title":"Developer guides","text":"

The purpose of the Developer guides documentation is to get you acquainted with Dokka's internals so that you can start developing your own plugins or contributing features and fixes to Dokka itself.

If you want to start hacking on Dokka right away, the only thing you need to be aware of is the general workflow: it will teach you how to build, debug and test Dokka locally.

CONTRIBUTING.md contains information that can be useful if you want to contribute to Dokka.

If you want to get into plugin development quickly, see Introduction to plugin development.

If you have time to spare and want to know more about Dokka's internals, its architecture and capabilities, follow Architecture overview and subsequent sections inside Internals.

Having read through all the developer guides, you'll have a pretty good understanding of Dokka and how to develop for it.

If you have any questions, feel free to get in touch with maintainers via Slack or GitHub.

"},{"location":"developer_guide/workflow/","title":"Workflow","text":"

Whether you're contributing a feature/fix to Dokka itself or developing a Dokka plugin, there are 3 essential things you need to know how to do:

  1. How to build Dokka or a plugin
  2. How to use/test locally built Dokka in a project
  3. How to debug Dokka or a plugin in IntelliJ IDEA

We'll go over each step individually in this section.

Examples below will be specific to Gradle and Gradle\u2019s Kotlin DSL, but you can apply the same principles and run/test/debug with CLI/Maven runners and build configurations if you wish.

"},{"location":"developer_guide/workflow/#build-dokka","title":"Build Dokka","text":"

Building Dokka is pretty straightforward:

./gradlew build\n

This will build all subprojects and trigger builds in all composite builds. If you are working on a runner, you can build it independently.
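
For example, to build only the Gradle runner you could run something along these lines (the project path here is illustrative, not the actual module path; check settings.gradle.kts for the real project paths):

./gradlew :runners:gradle-plugin:build\n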

The checks that are performed as part of the build do not require any special configuration or environment and should not take much time (~2-5 minutes), so please make sure they pass before submitting a pull request.

"},{"location":"developer_guide/workflow/#troubleshooting-build","title":"Troubleshooting build","text":""},{"location":"developer_guide/workflow/#api-check-failed-for-project","title":"API check failed for project ..","text":"

If you see a message like API check failed for project .. during the build phase, it indicates that the binary compatibility check has failed, meaning you've changed/added/removed some public API.

If the change was intentional, run ./gradlew apiDump - it will re-generate the .api files with signatures, and you should be able to build Dokka with no errors. These updated files need to be committed as well. Maintainers will review API changes thoroughly, so please make sure the change is intentional and justified.
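
A typical sequence after an intentional API change might look like this (the git command is only a suggestion for reviewing the regenerated files before committing them):

./gradlew apiDump\ngit diff -- '*.api'\n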

"},{"location":"developer_guide/workflow/#use-test-locally-built-dokka","title":"Use / test locally built Dokka","text":"

Having built Dokka locally, you can publish it to mavenLocal(). This will allow you to test your changes in another project as well as debug code remotely.

  1. Publish a custom version of Dokka to Maven Local: ./gradlew publishToMavenLocal -Pversion=1.9.20-my-fix-SNAPSHOT. This version will be propagated to plugins that reside inside Dokka's project (mathjax, kotlin-as-java, etc), and its artifacts should appear in ~/.m2
  2. In the project you want to generate documentation for or debug on, add maven local as a plugin/dependency repository:
    repositories {\n   mavenLocal()\n}\n
  3. Update your Dokka dependency to the version you've just published:
    plugins {\n    id(\"org.jetbrains.dokka\") version \"1.9.20-my-fix-SNAPSHOT\"\n}\n

After completing these steps, you should be able to build documentation using your own version of Dokka.
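
Putting these steps together, the debug project's build script might look roughly like this (a sketch; use whatever version you published, and note that depending on your setup mavenLocal() may also need to be added to the pluginManagement repositories in settings.gradle.kts):

// build.gradle.kts of the project you want to generate documentation for\nplugins {\n    id(\"org.jetbrains.dokka\") version \"1.9.20-my-fix-SNAPSHOT\"\n}\n\nrepositories {\n    mavenLocal()\n    mavenCentral()\n}\n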

"},{"location":"developer_guide/workflow/#debugging-dokka","title":"Debugging Dokka","text":"

Dokka is essentially a Gradle plugin, so you can debug it the same way you would any other Gradle plugin.

Below you'll find instructions on how to debug Dokka's internal logic, but you can apply the same principles if you wish to debug a Dokka plugin.

  1. Choose a project to debug on; it needs to have some code for which documentation will be generated. Prefer smaller projects that reproduce the exact problem or behaviour you want, since the less code you have, the easier it will be to understand what's going on. You can use the example projects found in dokka/examples/gradle; there are both simple single-module and more complex multi-module / multiplatform examples.
  2. For the debug project, set org.gradle.debug to true in one of the following ways:

    • In your gradle.properties add org.gradle.debug=true
    • When running Dokka tasks: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon
  3. Run the desired Dokka task with --no-daemon. Gradle should wait until you attach the debugger before proceeding with the task, so there's no need to hurry here. Example: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon.

  4. Open Dokka in IntelliJ IDEA, set a breakpoint and, using remote debug, attach to the process running on the default port 5005. You can do that either by creating a Remote JVM Debug run/debug configuration or by attaching to the process via Run -> Attach to Process.

Note

The reason for --no-daemon is that Gradle daemons continue to run even after the task has completed, so you might hang in debug or run into port was already in use errors if you try to run the task again.

If you previously ran Dokka with daemons enabled and are already encountering problems, try killing the Gradle daemons, for instance via pkill -f gradle.*daemon

In case you need to debug some other part of the build, consult the official Gradle tutorials on Troubleshooting Builds.

"},{"location":"developer_guide/workflow/#run-integration-tests","title":"Run integration tests","text":"

Dokka's integration tests help check compatibility with various versions of Kotlin, Android, Gradle and Java. They apply Dokka to real user-like projects and invoke Gradle / Maven / CLI tasks to generate the documentation.

Integration tests require a significant amount of available RAM (~20-30GB), take 1+ hour and may require additional environment configuration to run. For these reasons, it's not expected that you run all integration tests locally as part of the everyday development process; they will be run on CI once you submit a PR.

However, if you need to run all integration tests locally, you can use the integrationTest task:

./gradlew integrationTest\n

If you need to run a specific test locally, you can run it from your IDE or by calling the corresponding Gradle task (for example, :dokka-integration-tests:gradle:testExternalProjectKotlinxCoroutines).
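
For example, to run the kotlinx.coroutines integration test mentioned above from the command line:

./gradlew :dokka-integration-tests:gradle:testExternalProjectKotlinxCoroutines\n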

"},{"location":"developer_guide/architecture/architecture_overview/","title":"Architecture overview","text":"

At first glance, you might think that a tool like Dokka simply parses some programming language sources and generates HTML pages for whatever it sees along the way, with little to no abstraction. That would be the simplest and most straightforward way to implement an API documentation engine.

However, it was clear that Dokka might need to generate documentation from various sources (not only Kotlin), that users might request additional output formats (like Markdown), and that users might need additional features such as support for custom KDoc tags or rendering mermaid.js diagrams - all of which would require changing a lot of code inside Dokka itself if all solutions were hardcoded.

For this reason, Dokka was built from the ground up to be easily extensible and customizable by adding several layers of abstractions to the data model, and by providing pluggable extension points, giving you the ability to introduce selective changes on a given level.

"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-data-model","title":"Overview of data model","text":"

Generating API documentation begins with input source files (.kt, .java, etc) and ends with some output files (.html/.md, etc). However, to allow for extensibility and customization, several input and output independent abstractions have been added to the data model.

Below you can find the general pipeline of processing data gathered from sources and the explanation for each stage.

flowchart TD\n    Input --> Documentables --> Pages --> Output

You, as a Dokka developer or a plugin writer, can use extension points to introduce selective changes to the model on one particular level without altering everything else.

For instance, if you wanted to make an annotation / function / class invisible in the final documentation, you would only need to modify the Documentables level by filtering undesirable declarations out. If you wanted to display all overloaded methods on the same page instead of on separate ones, you would only need to modify the Pages layer by merging multiple pages into one, and so on.

For a deeper dive into Dokka's model with more examples and details, see the sections about Documentables and Page/Content.

For an overview of existing extension points that let you transform Dokka's models, see Core extension points and Base extensions.

"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-extension-points","title":"Overview of extension points","text":"

An extension point usually represents a pluggable interface that performs an action during one of the stages of generating documentation. An extension is, therefore, an implementation of that interface which extends the extension point.

You can create extension points, provide your own implementations (extensions) and configure them. All of this is possible with Dokka's plugin / extension point API.

Here's a sneak peek of the DSL:

// declare your own plugin\nclass MyPlugin : DokkaPlugin() {\n    // create an extension point for developers to use\n    val signatureProvider by extensionPoint<SignatureProvider>()\n\n    // provide a default implementation\n    val defaultSignatureProvider by extending {\n        signatureProvider with KotlinSignatureProvider()\n    }\n\n    // register our own extension in Dokka's Base plugin by overriding its default implementation\n    val dokkaBasePlugin by lazy { plugin<DokkaBase>() }\n    val multimoduleLocationProvider by extending {\n        (dokkaBasePlugin.locationProviderFactory\n                providing MultimoduleLocationProvider::Factory\n                override dokkaBasePlugin.locationProvider)\n    }\n}\n\nclass MyExtension(val context: DokkaContext) {\n\n    // use an existing extension\n    val signatureProvider: SignatureProvider = context.plugin<MyPlugin>().querySingle { signatureProvider }\n\n    fun doSomething() {\n        signatureProvider.signature(..)\n    }\n}\n\ninterface SignatureProvider {\n    fun signature(documentable: Documentable): List<ContentNode>\n}\n\nclass KotlinSignatureProvider : SignatureProvider {\n    override fun signature(documentable: Documentable): List<ContentNode> = listOf()\n}\n

For a deeper dive into extensions and extension points, see Introduction to Extensions.

For an overview of existing extension points, see Core extension points and Base extensions.

"},{"location":"developer_guide/architecture/architecture_overview/#historical-context","title":"Historical context","text":"

This is the second iteration of Dokka, which was built from scratch.

If you want to learn more about why Dokka was redesigned this way, watch this great talk by Pawe\u0142 Marks: New Dokka - Designed for Fearless Creativity. The general principles and architecture are the same, although the talk may be outdated in some areas, so please double-check.

"},{"location":"developer_guide/architecture/data_model/documentable_model/","title":"Documentable Model","text":"

The Documentable model represents the data that is parsed from some programming language sources. Think of this data as something that could be produced by a compiler frontend; that's not far off from the truth.

By default, the documentables are created from:

Code-wise, you can have a look at the following classes:

Upon creation, the documentable model represents a collection of trees, each with DModule as root.

Take some arbitrary Kotlin source code that is located within the same module:

// Package 1\nclass Clazz(val property: String) {\n    fun function(parameter: String) {}\n}\n\nfun topLevelFunction() {}\n\n// Package 2\nenum class Enum { }\n\nval topLevelProperty: String\n

This would be represented roughly as the following Documentable tree:

flowchart TD\n    DModule --> firstPackage[DPackage]\n    firstPackage --> DClass\n    firstPackage --> toplevelfunction[DFunction] \n    DClass --> DProperty\n    DClass --> DFunction\n    DFunction --> DParameter\n    DModule --> secondPackage[DPackage]\n    secondPackage --> DEnum\n    secondPackage --> secondPackageProperty[DProperty]

At later stages of transformation, all trees are folded into one by DocumentableMerger.

"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentable","title":"Documentable","text":"

The main building block of the documentable model is the Documentable class. It is the base class for all more specific types. All implementations represent elements of source code with mostly self-explanatory names: DFunction, DPackage, DProperty, and so on.

DClasslike is the base class for all class-like documentables, such as DClass, DEnum, DAnnotation and others.

The contents of each documentable normally represent what you would see in the source code.

For example, if you open DClass, you should find that it contains references to functions, properties, companion objects, constructors and so on. DEnum should have references to its entries, and DPackage can have references to both classlikes and top-level functions and properties (Kotlin-specific).

Here's an example of a documentable:

data class DClass(\n    val dri: DRI,\n    val name: String,\n    val constructors: List<DFunction>,\n    val functions: List<DFunction>,\n    val properties: List<DProperty>,\n    val classlikes: List<DClasslike>,\n    val sources: SourceSetDependent<DocumentableSource>,\n    val visibility: SourceSetDependent<Visibility>,\n    val companion: DObject?,\n    val generics: List<DTypeParameter>,\n    val supertypes: SourceSetDependent<List<TypeConstructorWithKind>>,\n    val documentation: SourceSetDependent<DocumentationNode>,\n    val expectPresentInSet: DokkaSourceSet?,\n    val modifier: SourceSetDependent<Modifier>,\n    val sourceSets: Set<DokkaSourceSet>,\n    val isExpectActual: Boolean,\n    val extra: PropertyContainer<DClass> = PropertyContainer.empty()\n) : DClasslike(), WithAbstraction, WithCompanion, WithConstructors,\n    WithGenerics, WithSupertypes, WithExtraProperties<DClass>\n

There are three non-documentable classes that are important for this model:

"},{"location":"developer_guide/architecture/data_model/documentable_model/#dri","title":"DRI","text":"

DRI stands for Dokka Resource Identifier - a unique value that identifies a specific Documentable. All references and relations between the documentables (other than direct ownership) are described using DRI.

For example, DFunction with a parameter of type Foo only has Foo's DRI, but not the actual reference to Foo's Documentable object.

"},{"location":"developer_guide/architecture/data_model/documentable_model/#example","title":"Example","text":"

For an example of what a DRI can look like, let's take the limitedParallelism function from kotlinx.coroutines:

package kotlinx.coroutines\n\nimport ...\n\npublic abstract class MainCoroutineDispatcher : CoroutineDispatcher() {\n\n    override fun limitedParallelism(parallelism: Int): CoroutineDispatcher {\n        ...\n    }\n}\n

If we were to re-create the DRI of this function in code, it would look something like this:

DRI(\n    packageName = \"kotlinx.coroutines\",\n    classNames = \"MainCoroutineDispatcher\",\n    callable = Callable(\n        name = \"limitedParallelism\",\n        receiver = null,\n        params = listOf(\n            TypeConstructor(\n                fullyQualifiedName = \"kotlin.Int\",\n                params = emptyList()\n            )\n        )\n    ),\n    target = PointingToDeclaration,\n    extra = null\n)\n

If you format it as String, it would look like this:

kotlinx.coroutines/MainCoroutineDispatcher/limitedParallelism/#kotlin.Int/PointingToDeclaration/\n
"},{"location":"developer_guide/architecture/data_model/documentable_model/#sourcesetdependent","title":"SourceSetDependent","text":"

SourceSetDependent helps handle multiplatform data by associating platform-specific data (declared with either the expect or the actual modifier) with particular source sets.

This comes in handy if the expect / actual declarations differ. For example, the default value for actual might differ from that declared in expect, or code comments written for expect might be different from what's written for actual.

Under the hood, it's a typealias to a Map:

typealias SourceSetDependent<T> = Map<DokkaSourceSet, T>\n
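
Since it is just a Map, reading platform-specific data is a simple lookup. In this sketch, function and jvmSourceSet are hypothetical values taken from the surrounding code:

// documentation is declared as SourceSetDependent<DocumentationNode>\nval jvmDocs: DocumentationNode? = function.documentation[jvmSourceSet]\n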
"},{"location":"developer_guide/architecture/data_model/documentable_model/#extraproperty","title":"ExtraProperty","text":"

ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins.

This element is a bit more complex, so you can read more about how to use it in a separate section.

"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentation-model","title":"Documentation model","text":"

The Documentation model is used alongside documentables to store data obtained by parsing code comments (such as KDocs / Javadocs).

"},{"location":"developer_guide/architecture/data_model/documentable_model/#doctag","title":"DocTag","text":"

DocTag describes a specific documentation syntax element.

It's universal across language sources. For example, the DocTag B is the same for **bold** in Kotlin and <b>bold</b> in Java.
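
As a rough illustration (not exact parser output), the bold text above could be represented as a B node with a nested Text child:

B(children = listOf(Text(body = \"bold\")))\n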

However, some DocTag elements are specific to one language. There are many such examples for Java, because it allows HTML tags inside Javadoc comments, some of which are simply not possible to reproduce with the Markdown that KDoc uses.

DocTag elements can be deeply nested with other DocTag children elements.

Examples:

data class H1(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class H2(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class Strikethrough(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class Strong(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class CodeBlock(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : Code()\n
"},{"location":"developer_guide/architecture/data_model/documentable_model/#tagwrapper","title":"TagWrapper","text":"

TagWrapper describes the whole comment description or a specific comment tag. For example: @see / @author / @return.

Since each such section may contain formatted text inside it, each TagWrapper has DocTag children.

/**\n * @author **Ben Affleck*\n * @return nothing, except _sometimes_ it may throw an [Error]\n */\nfun foo() {}\n
"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentationnode","title":"DocumentationNode","text":"

DocumentationNode acts as a container for multiple TagWrapper elements for a specific Documentable, usually used like this:

data class DFunction(\n    ...\n    val documentation: SourceSetDependent<DocumentationNode>,\n    ...\n)\n
"},{"location":"developer_guide/architecture/data_model/extra/","title":"Extra","text":""},{"location":"developer_guide/architecture/data_model/extra/#introduction","title":"Introduction","text":"

ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins.

ExtraProperty classes are available both in the Documentable and the Content models.

To create a new extra, you need to implement the ExtraProperty interface. It is advised to use the following pattern when declaring new extras:

data class CustomExtra(\n    [any data relevant to your extra], \n    [any data relevant to your extra] \n): ExtraProperty<Documentable> {\n    override val key: ExtraProperty.Key<Documentable, *> = CustomExtra\n    companion object : ExtraProperty.Key<Documentable, CustomExtra>\n}\n

The merge strategy (the mergeStrategyFor method) for extras is invoked while merging documentables from different source sets, when the documentables being merged each have their own Extra of the same type.

"},{"location":"developer_guide/architecture/data_model/extra/#propertycontainer","title":"PropertyContainer","text":"

All extras for ContentNode and Documentable classes are stored in the PropertyContainer<C : Any> class instances.

data class DFunction(\n    ...\n    override val extra: PropertyContainer<DFunction> = PropertyContainer.empty()\n    ...\n) : WithExtraProperties<DFunction>\n

PropertyContainer has a number of convenient functions for handling extras in a collection-like manner.

The generic class parameter C limits the types of properties that can be stored in the container - it must match the generic C class parameter from the ExtraProperty interface. This allows creating extra properties which can only be stored in a specific Documentable.

"},{"location":"developer_guide/architecture/data_model/extra/#usage-example","title":"Usage example","text":"

In the following example, we will create a DFunction-only extra property, store it and then retrieve its value:

// Extra that is applicable only to DFunction\ndata class CustomExtra(val customExtraValue: String) : ExtraProperty<DFunction> {\n    override val key: ExtraProperty.Key<Documentable, *> = CustomExtra\n    companion object: ExtraProperty.Key<Documentable, CustomExtra>\n}\n\n// Storing it inside the documentable\nfun DFunction.withCustomExtraProperty(data: String): DFunction {\n    return this.copy(\n        extra = extra + CustomExtra(data)\n    )\n}\n\n// Retrieving it from the documentable\nfun DFunction.getCustomExtraPropertyValue(): String? {\n    return this.extra[CustomExtra]?.customExtraValue\n}\n

You can also use extras as markers, without storing any data in them:

object MarkerExtra : ExtraProperty<Any>, ExtraProperty.Key<Any, MarkerExtra> {\n    override val key: ExtraProperty.Key<Any, *> = this\n}\n\nfun Documentable.markIfFunction(): Documentable {\n    return when(this) {\n        is DFunction -> this.copy(extra = extra + MarkerExtra)\n        else -> this\n    }\n}\n\nfun WithExtraProperties<Documentable>.isMarked(): Boolean {\n    return this.extra[MarkerExtra] != null\n}\n
"},{"location":"developer_guide/architecture/data_model/page_content/","title":"Page / Content Model","text":"

Even though the Page and Content models reside on the same level (under Page), it is easier to view them as two different models altogether, even though Content is only used in conjunction with, and inside of, the Page model.

"},{"location":"developer_guide/architecture/data_model/page_content/#page","title":"Page","text":"

The Page model represents the structure of documentation pages to be generated. During rendering, each page is processed separately, so one page corresponds to exactly one output file.

The Page model is independent of the final output format. In other words, it's universal. Which file extension the pages should be created as (.html, .md, etc), and how, is up to the Renderer extension.

Subclasses of the PageNode class represent the different kinds of pages, such as ModulePage, PackagePage, ClasslikePage, MemberPage and so on.

The Page model can be represented as a tree, with RootPageNode at the root.

Here's an example of what an arbitrary project's Page tree might look like if the project consists of a module with 3 packages, one of which contains a top-level function, a top-level property and a class, inside of which there is a function and a property:

flowchart TD\n    RootPageNode --> firstPackage[PackagePageNode]\n    RootPageNode --> secondPackage[PackagePageNode]\n    RootPageNode --> thirdPackage[PackagePageNode]\n    firstPackage --> firstPackageFirstMember[MemberPageNode - Function]\n    firstPackage --> firstPackageSecondMember[MemberPageNode - Property]\n    firstPackage ---> firstPackageClasslike[ClasslikePageNode - Class]\n    firstPackageClasslike --> firstPackageClasslikeFirstMember[MemberPageNode - Function]\n    firstPackageClasslike --> firstPackageClasslikeSecondMember[MemberPageNode - Property]\n    secondPackage --> etcOne[...]\n    thirdPackage --> etcTwo[...]

Almost all pages are derivatives of ContentPage - it's the type of a page that has user-visible content on it.

"},{"location":"developer_guide/architecture/data_model/page_content/#content-model","title":"Content Model","text":"

The Content model describes what the pages consist of. It is essentially a set of building blocks that you can put together to represent some content. It is also output-format independent and universal.

For example, have a look at the subclasses of ContentNode: ContentText, ContentList, ContentTable, ContentCodeBlock, ContentHeader and so on -- all self-explanatory. You can group chunks of content together with ContentGroup - for example, to wrap all children with a style.

// real example of composing content using the `DocumentableContentBuilder` DSL\norderedList {\n    item {\n        text(\"This list contains a nested table:\")\n        table {\n            header {\n                text(\"Col1\")\n                text(\"Col2\")\n            }\n            row {\n                text(\"Text1\")\n                text(\"Text2\")\n            }\n        }\n    }\n    item {\n        group(styles = setOf(TextStyle.Bold)) {\n            text(\"This is bold\")\n            text(\"This is also bold\")\n        }\n    }\n}\n

It is the responsibility of the Renderer (i.e. a specific output format) to render it in a way the user can process it, be it visually (HTML pages) or otherwise (JSON).

For instance, HtmlRenderer might render ContentCodeBlock as <code>text</code>, but CommonmarkRenderer might render it using backticks.

"},{"location":"developer_guide/architecture/data_model/page_content/#dci","title":"DCI","text":"

Each node is identified by a unique DCI, which stands for Dokka Content Identifier.

DCI aggregates DRIs of all documentables that are used by the given ContentNode.

data class DCI(val dri: Set<DRI>, val kind: Kind)\n

All references to other nodes (other than direct ownership) are described using DCI.

"},{"location":"developer_guide/architecture/data_model/page_content/#contentkind","title":"ContentKind","text":"

ContentKind represents a grouping of content of one kind that can be rendered as part of a composite page, like a single tab or a block within a class's page.

For example, on the same page that describes a class you can have multiple sections (== ContentKinds). One to describe functions, one to describe properties, another one to describe the constructors, and so on.

"},{"location":"developer_guide/architecture/data_model/page_content/#styles","title":"Styles","text":"

Each ContentNode has a styles property in case you want to indicate to the Renderer that this content needs to be rendered in a certain way.

group(styles = setOf(TextStyle.Paragraph)) {\n    text(\"Text1\", styles = setOf(TextStyle.Bold))\n    text(\"Text2\", styles = setOf(TextStyle.Italic))\n}\n

It is the responsibility of the Renderer (i.e. a specific output format) to render it in a way the user can process it. For instance, HtmlRenderer might render TextStyle.Bold as <b>text</b>, whereas CommonmarkRenderer might render it as **text**.

There are a number of existing styles that you can use, most of which are supported by the HtmlRenderer extension out of the box:

// for code highlighting\nenum class TokenStyle : Style {\n    Keyword, Punctuation, Function, Operator, Annotation,\n    Number, String, Boolean, Constant, Builtin, ...\n}\n\nenum class TextStyle : Style {\n    Bold, Italic, Strong, Strikethrough, Paragraph, ...\n}\n\nenum class ContentStyle : Style {\n    TabbedContent, RunnableSample, Wrapped, Indented, ...\n}\n
"},{"location":"developer_guide/architecture/data_model/page_content/#extra","title":"Extra","text":"

ExtraProperty is used to store any additional information that falls outside of the regular model.

It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins.

All ExtraProperty elements from the Documentable model are propagated into the Content model, and are available in the Renderer extensions.

This element is a bit complex, so you can read more about how to use it in a separate section.

"},{"location":"developer_guide/architecture/extension_points/base_plugin/","title":"Base plugin","text":"

DokkaBase represents Dokka's Base plugin, which provides a number of sensible default implementations for CoreExtensions, as well as declares its own, more high-level abstractions and extension points to be used from other plugins and output formats.

If you want to develop a simple plugin that only changes a few details, it is very convenient to rely on default implementations and use extension points defined in DokkaBase, as it reduces the scope of changes you need to make.

DokkaBase is used extensively in Dokka's own output formats.

You can learn how to add, use, override and configure extensions and extension points in Introduction to Extensions - all of that information is applicable to the DokkaBase plugin as well.

"},{"location":"developer_guide/architecture/extension_points/base_plugin/#extension-points","title":"Extension points","text":"

Below are some notable extension points defined in Dokka's Base plugin.

"},{"location":"developer_guide/architecture/extension_points/base_plugin/#premergedocumentabletransformer","title":"PreMergeDocumentableTransformer","text":"

PreMergeDocumentableTransformer is very similar to the DocumentableTransformer core extension point, but it is used during an earlier stage by the Single module generation.

This extension point allows you to apply any transformations to the Documentables model before the project's source sets are merged.

It is useful if you want to filter/map existing documentables. For example, if you want to exclude members annotated with @Internal, you most likely need an implementation of PreMergeDocumentableTransformer.

For simple condition-based filtering of documentables, consider extending SuppressedByConditionDocumentableFilterTransformer - it implements PreMergeDocumentableTransformer and only requires one function to be overridden, whereas the rest is taken care of.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/","title":"Core extension points","text":"

Core extension points represent the main stages of generating documentation.

These extension points are plugin and output format independent, meaning they represent the very core functionality and are as low-level as it gets in Dokka.

For higher-level extension functions that can be used in different output formats, have a look at the Base plugin.

You can find all core extensions in the CoreExtensions class:

object CoreExtensions {\n    val preGenerationCheck by coreExtensionPoint<PreGenerationChecker>()\n    val generation by coreExtensionPoint<Generation>()\n    val sourceToDocumentableTranslator by coreExtensionPoint<SourceToDocumentableTranslator>()\n    val documentableMerger by coreExtensionPoint<DocumentableMerger>()\n    val documentableTransformer by coreExtensionPoint<DocumentableTransformer>()\n    val documentableToPageTranslator by coreExtensionPoint<DocumentableToPageTranslator>()\n    val pageTransformer by coreExtensionPoint<PageTransformer>()\n    val renderer by coreExtensionPoint<Renderer>()\n    val postActions by coreExtensionPoint<PostAction>()\n}\n

On this page, we'll go over each extension point individually.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pregenerationchecker","title":"PreGenerationChecker","text":"

PreGenerationChecker can be used to run checks and enforce constraints before generation begins.

For example, Dokka's Javadoc plugin does not support generating documentation for multi-platform projects, so it uses PreGenerationChecker to check for multi-platform source sets, and fails if it finds any.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#generation","title":"Generation","text":"

Generation is responsible for generating documentation as a whole, utilizing higher-level extensions and extension points where applicable.

See Generation implementations to learn about the default implementations.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#sourcetodocumentabletranslator","title":"SourceToDocumentableTranslator","text":"

SourceToDocumentableTranslator translates any given sources into the Documentable model.

Kotlin and Java sources are supported by default by the Base plugin, but you can analyze any language as long as you can map it to the Documentable model.

For reference, see

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentablemerger","title":"DocumentableMerger","text":"

DocumentableMerger merges all DModule instances into one. Only one extension of this type is expected to be registered.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletransformer","title":"DocumentableTransformer","text":"

DocumentableTransformer performs the same function as PreMergeDocumentableTransformer, but after merging source sets.

A notable example is InheritorsExtractorTransformer, which extracts inheritance information from source sets and creates an inheritance map.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletopagetranslator","title":"DocumentableToPageTranslator","text":"

DocumentableToPageTranslator is responsible for creating pages and their content. See Page / Content model page for more information and examples.

Output formats can either use the same page structure or define their own.

Only a single extension of this type is expected to be registered.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pagetransformer","title":"PageTransformer","text":"

PageTransformer is useful if you need to add, remove or modify generated pages or their content.

Using this extension point, plugins like org.jetbrains.dokka:mathjax-plugin can add .js scripts to the HTML pages.

If you want all overloaded functions to be rendered on the same page instead of separate ones, you can use PageTransformer to combine the pages into a single one.
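
Below is a minimal sketch of such a transformer. The exact shape of the PageTransformer interface should be verified against the sources, and the actual merging logic is omitted:

class MergeOverloadsPageTransformer : PageTransformer {\n    override fun invoke(input: RootPageNode): RootPageNode {\n        // walk the page tree and merge the MemberPageNodes that represent overloads (omitted)\n        return input\n    }\n}\n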

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#renderer","title":"Renderer","text":"

Renderer defines the rules for how to render pages and their content: which files to create and how to display the content properly.

Custom output format plugins should use the Renderer extension point. Notable examples are HtmlRenderer and CommonmarkRenderer.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#postaction","title":"PostAction","text":"

PostAction can be used when you want to run some actions after the documentation has been generated - for example, to move some files around or log some informational messages.

Dokka's Versioning plugin utilizes PostAction to move generated documentation to the versioned directories.

"},{"location":"developer_guide/architecture/extension_points/extension_points/","title":"Extension points","text":"

In this section you can learn how to create new extension points, how to configure existing ones, and how to query for registered extensions when generating documentation.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#declaring-extension-points","title":"Declaring extension points","text":"

If you are writing a plugin, you can create your own extension points that other developers (or you) can use in other plugins / parts of code.

class MyPlugin : DokkaPlugin() {\n    val sampleExtensionPoint by extensionPoint<SampleExtensionPointInterface>()\n}\n\ninterface SampleExtensionPointInterface {\n    fun doSomething(input: Input): List<Output>\n}\n\nclass Input\nclass Output\n

Usually, you would want to provide some default implementations for your extension points. You can do that within the same plugin class by extending an extension point you've just created. See Extending from extension points for examples.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#extending-from-extension-points","title":"Extending from extension points","text":"

You can use extension points to provide your own implementations in order to customize a plugin's behaviour.

If you want to provide an implementation for an extension point declared in an external plugin (including DokkaBase), you can use plugin querying API to do that.

The example below shows how to extend MyPlugin (that was created above) with an implementation of SampleExtensionPointInterface.

class MyExtendedPlugin : DokkaPlugin() {\n\n    val mySampleExtensionImplementation by extending {\n        plugin<MyPlugin>().sampleExtensionPoint with SampleExtensionImpl()\n    }\n}\n\nclass SampleExtensionImpl : SampleExtensionPointInterface {\n    override fun doSomething(input: Input): List<Output> = listOf()\n}\n

Alternatively, if it is your own plugin, you can do that within the same class as the extension point itself:

open class MyPlugin : DokkaPlugin() {\n    val sampleExtensionPoint by extensionPoint<SampleExtensionPointInterface>()\n\n    val defaultSampleExtension by extending {\n        sampleExtensionPoint with DefaultSampleExtension()\n    }\n}\n\nclass DefaultSampleExtension : SampleExtensionPointInterface {\n    override fun doSomething(input: Input): List<Output> = listOf()\n}\n
"},{"location":"developer_guide/architecture/extension_points/extension_points/#providing","title":"Providing","text":"

If you need to have access to DokkaContext when creating an extension, you can use the providing keyword instead.

val defaultSampleExtension by extending {\n    sampleExtensionPoint providing { context ->\n        // can use context to query other extensions or get configuration \n        DefaultSampleExtension() \n    }\n}\n

You can read more on what you can do with context in Obtaining extension instance.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#override","title":"Override","text":"

By extending an extension point, you are registering an additional extension. This behaviour is expected by some extension points, for example the Documentable transformers, because all registered transformer extensions do their own transformations independently and one after the other.

However, a plugin can expect only a single extension to be registered for an extension point. In this case, you can use the override keyword to override the existing registered extension:

class MyExtendedPlugin : DokkaPlugin() {\n    private val myPlugin by lazy { plugin<MyPlugin>() }\n\n    val mySampleExtensionImplementation by extending {\n        (myPlugin.sampleExtensionPoint\n                with SampleExtensionImpl()\n                override myPlugin.defaultSampleExtension)\n    }\n}\n

This is also useful if you wish to override some extension from DokkaBase, to disable or alter it.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#order","title":"Order","text":"

Sometimes, the order in which extensions are invoked matters. This is something you can control as well using the order construct:

class MyExtendedPlugin : DokkaPlugin() {\n    private val myPlugin by lazy { plugin<MyPlugin>() }\n\n    val mySampleExtensionImplementation by extending {\n        myPlugin.sampleExtensionPoint with SampleExtensionImpl() order {\n            before(myPlugin.firstExtension)\n            after(myPlugin.thirdExtension)\n        }\n    }\n}\n
"},{"location":"developer_guide/architecture/extension_points/extension_points/#conditional-apply","title":"Conditional apply","text":"

If you want your extension to be registered only if some condition is true, you can use the applyIf construct:

class MyExtendedPlugin : DokkaPlugin() {\n    private val myPlugin by lazy { plugin<MyPlugin>() }\n\n    val mySampleExtensionImplementation by extending {\n        myPlugin.sampleExtensionPoint with SampleExtensionImpl() applyIf {\n            Random.Default.nextBoolean()\n        }\n    }\n}\n
"},{"location":"developer_guide/architecture/extension_points/extension_points/#obtaining-extension-instance","title":"Obtaining extension instance","text":"

After an extension point has been created and some extensions have been registered, you can use query and querySingle functions to find all or just a single implementation.

class MyExtension(context: DokkaContext) {\n    // returns all registered extensions for the extension point\n    val allSampleExtensions = context.plugin<MyPlugin>().query { sampleExtensionPoint }\n\n    // will throw an exception if more than one extension is found.\n    // use if you expect only a single extension to be registered for the extension point\n    val singleSampleExtensions = context.plugin<MyPlugin>().querySingle { sampleExtensionPoint }\n\n    fun invoke() {\n        allSampleExtensions.forEach { it.doSomething(Input()) }\n\n        singleSampleExtensions.doSomething(Input())\n    }\n}\n

In order to have access to DokkaContext, you can use the providing keyword when registering an extension.

"},{"location":"developer_guide/architecture/extension_points/generation_implementations/","title":"Generation implementations","text":"

There are two main implementations of the Generation core extension point:

"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#singlemodulegeneration","title":"SingleModuleGeneration","text":"

SingleModuleGeneration is at the heart of generating documentation. It utilizes core and base extensions to build the documentation from start to finish.

Below you can see the flow of how Dokka's data model is transformed by various core and base extensions.

flowchart TD\n    Input -- SourceToDocumentableTranslator --> doc1[Documentables]\n    subgraph documentables [ ]\n    doc1 -- PreMergeDocumentableTransformer --> doc2[Documentables]\n    doc2 -- DocumentableMerger --> doc3[Documentables]\n    doc3 -- DocumentableTransformer --> doc4[Documentables]\n    end\n    doc4 -- DocumentableToPageTranslator --> page1[Pages]\n    subgraph ide2 [ ]\n    page1 -- PageTransformer --> page2[Pages]\n    end\n    page2 -- Renderer --> Output

You can read about what each stage does in Core extension points and Base plugin.

"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#allmodulespagegeneration","title":"AllModulesPageGeneration","text":"

AllModulesPageGeneration utilizes the output generated by SingleModuleGeneration.

Under the hood, it just collects all of the pages generated for individual modules and assembles them together, creating navigation links between the modules, and so on.

"},{"location":"developer_guide/community/slack/","title":"Slack channel","text":"

Dokka has a dedicated #dokka channel in the Kotlin Community Slack, where you can ask questions and chat about using, customizing or contributing to Dokka.

Follow the instructions to get an invite or connect directly.

"},{"location":"developer_guide/plugin-development/introduction/","title":"Introduction to plugin development","text":"

Dokka was built from the ground up to be easily extensible and highly customizable, which allows the community to implement plugins for missing or very specific features that are not provided out of the box.

Dokka plugins range anywhere from supporting other programming language sources to exotic output formats. You can add support for your own KDoc tags or annotations, teach Dokka how to render different DSLs that are found in KDoc descriptions, visually redesign Dokka's pages to be seamlessly integrated into your company's website, integrate it with other tools and so much more.

In order to have an easier time developing plugins, it's a good idea to go through Dokka's internals first, to learn more about its data model and extensions.

"},{"location":"developer_guide/plugin-development/introduction/#setup","title":"Setup","text":""},{"location":"developer_guide/plugin-development/introduction/#template","title":"Template","text":"

The easiest way to start is to use the convenient Dokka plugin template. It has pre-configured dependencies, publishing and signing of your artifacts.

"},{"location":"developer_guide/plugin-development/introduction/#manual","title":"Manual","text":"

At a bare minimum, a Dokka plugin requires dokka-core as a dependency:

import org.jetbrains.kotlin.gradle.dsl.JvmTarget\nimport org.jetbrains.kotlin.gradle.tasks.KotlinCompile\n\n\nplugins {\n    kotlin(\"jvm\") version \"<kotlin_version>\"\n}\n\ndependencies {\n    compileOnly(\"org.jetbrains.dokka:dokka-core:<dokka_version>\")\n}\n\ntasks.withType<KotlinCompile>().configureEach {\n    compilerOptions.jvmTarget.set(JvmTarget.JVM_1_8)\n}\n

In order to load a plugin into Dokka, your class must extend the DokkaPlugin class. A fully qualified name of that class must be placed in a file named org.jetbrains.dokka.plugability.DokkaPlugin under resources/META-INF/services. All instances are automatically loaded during Dokka's configuration step using java.util.ServiceLoader.
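
For example, if your plugin class were com.example.MyPlugin (a placeholder name), the registration file would look like this:

# resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin\ncom.example.MyPlugin\n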

"},{"location":"developer_guide/plugin-development/introduction/#extension-points","title":"Extension points","text":"

Dokka provides a set of entry points for which you can create your own implementations. If you are not sure which extension points to use, have a look at core extensions and base extensions.

You can learn how to declare extension points and extensions in Introduction to Extension points.

In case no suitable extension point exists for your use case, do share the use case with the maintainers \u2014 it might be added in a future version of Dokka.

"},{"location":"developer_guide/plugin-development/introduction/#example","title":"Example","text":"

You can follow the sample plugin tutorial, which covers the creation of a simple plugin that hides members annotated with your own @Internal annotation: that is, it excludes these members from the generated documentation.

For more practical examples, have a look at sources of community plugins.

"},{"location":"developer_guide/plugin-development/introduction/#help","title":"Help","text":"

If you have any further questions, feel free to get in touch with Dokka's maintainers via Slack or GitHub.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/","title":"Sample plugin tutorial","text":"

We'll go over creating a simple plugin that covers a very common use case: generate documentation for everything except for members annotated with a custom @Internal annotation - they should be hidden.

The plugin will be tested with the following code:

package org.jetbrains.dokka.internal.test\n\nannotation class Internal\n\nfun shouldBeVisible() {}\n\n@Internal\nfun shouldBeExcludedFromDocumentation() {}\n

Expected behavior: function shouldBeExcludedFromDocumentation should not be visible in generated documentation.

Full source code of this tutorial can be found in Dokka's examples under hide-internal-api.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#preparing-the-project","title":"Preparing the project","text":"

We'll begin by using the Dokka plugin template. Press the Use this template button and open this project in IntelliJ IDEA.

First, let's rename the pre-made template package and MyAwesomeDokkaPlugin class to something of our own.

For instance, the package can be renamed to org.example.dokka.plugin and the class to HideInternalApiPlugin:

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.plugability.DokkaPlugin\n\nclass HideInternalApiPlugin : DokkaPlugin() {\n\n}\n

After you do that, make sure to update the path to this class in resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin:

org.example.dokka.plugin.HideInternalApiPlugin\n

At this point, you can also change the project name in settings.gradle.kts (to hide-internal-api in our case) and the groupId in build.gradle.kts.
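
For example, the project name change is a one-liner:

// settings.gradle.kts\nrootProject.name = \"hide-internal-api\"\n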

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#extending-dokka","title":"Extending Dokka","text":"

After preparing the project we can begin extending Dokka with our own extension.

Having read through Core extensions, it's clear that we need a PreMergeDocumentableTransformer extension in order to filter out undesired documentables.

Moreover, the article mentioned a convenient abstract transformer SuppressedByConditionDocumentableFilterTransformer which is perfect for our use case, so we can try to implement it.

Create a new class, place it next to your plugin and implement the abstract method. You should end up with this:

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer\nimport org.jetbrains.dokka.model.Documentable\nimport org.jetbrains.dokka.plugability.DokkaContext\nimport org.jetbrains.dokka.plugability.DokkaPlugin\n\nclass HideInternalApiPlugin : DokkaPlugin() {}\n\nclass HideInternalApiTransformer(context: DokkaContext) : SuppressedByConditionDocumentableFilterTransformer(context) {\n\n    override fun shouldBeSuppressed(d: Documentable): Boolean {\n        return false\n    }\n}\n

Now we somehow need to find all annotations applied to d: Documentable and see if our @Internal annotation is present. However, it's not very clear how to do that. What usually helps is stopping in the debugger and having a look at what fields and values a given Documentable has.

To do that, we'll need to register our extension first; then we can publish our plugin and set the breakpoint.

Having read through Introduction to extensions, we now know how to register our extensions:

class HideInternalApiPlugin : DokkaPlugin() {\n    val myFilterExtension by extending {\n        plugin<DokkaBase>().preMergeDocumentableTransformer providing ::HideInternalApiTransformer\n    }\n}\n

At this point, we're ready to debug our plugin locally; it should already work, but do nothing.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#debugging","title":"Debugging","text":"

Please read through Debugging Dokka, it goes over the same steps in more detail and with examples. Below you will find rough instructions.

First, let's begin by publishing our plugin to mavenLocal().

./gradlew publishToMavenLocal\n

This will publish your plugin under the groupId, artifactId and version that you've specified in your build.gradle.kts. In our case it's org.example:hide-internal-api:1.0-SNAPSHOT.
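
These coordinates come from the corresponding properties in build.gradle.kts, roughly:

group = \"org.example\"\nversion = \"1.0-SNAPSHOT\"\n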

Open a debug project of your choosing that has Dokka configured, and add our plugin to dependencies:

dependencies {\n    dokkaPlugin(\"org.example:hide-internal-api:1.0-SNAPSHOT\")\n}\n

Next, in that project let's run dokkaHtml with debug enabled:

./gradlew clean dokkaHtml -Dorg.gradle.debug=true --no-daemon\n

Switch to the plugin project, set a breakpoint inside shouldBeSuppressed and attach the remote JVM debugger.

If you've done everything correctly, it should stop in the debugger, and you should be able to observe the values contained inside d: Documentable.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#implementing-plugin-logic","title":"Implementing plugin logic","text":"

Now that we've stopped at our breakpoint, let's skip ahead until we see the shouldBeExcludedFromDocumentation function in place of d: Documentable (observe the changing name property).

Looking at what's inside the object, you might notice it has 3 values in extra, one of which is Annotations. Sounds like something we need!

Having poked around, we come up with the following monstrosity of code for determining whether a given documentable has the @Internal annotation (it can, of course, be refactored... later):

override fun shouldBeSuppressed(d: Documentable): Boolean {\n\n    val annotations: List<Annotations.Annotation> =\n        (d as? WithExtraProperties<*>)\n            ?.extra\n            ?.allOfType<Annotations>()\n            ?.flatMap { it.directAnnotations.values.flatten() }\n            ?: emptyList()\n\n    return annotations.any { isInternalAnnotation(it) }\n}\n\nprivate fun isInternalAnnotation(annotation: Annotations.Annotation): Boolean {\n   return annotation.dri.packageName == \"org.jetbrains.dokka.internal.test\"\n           && annotation.dri.classNames == \"Internal\"\n}\n

Seems like we're done with writing our plugin and can begin testing it manually.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#manual-testing","title":"Manual testing","text":"

At this point, the implementation of your plugin should look roughly like this:

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.base.DokkaBase\nimport org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer\nimport org.jetbrains.dokka.model.Annotations\nimport org.jetbrains.dokka.model.Documentable\nimport org.jetbrains.dokka.model.properties.WithExtraProperties\nimport org.jetbrains.dokka.plugability.DokkaContext\nimport org.jetbrains.dokka.plugability.DokkaPlugin\n\nclass HideInternalApiPlugin : DokkaPlugin() {\n    val myFilterExtension by extending {\n        plugin<DokkaBase>().preMergeDocumentableTransformer providing ::HideInternalApiTransformer\n    }\n}\n\nclass HideInternalApiTransformer(context: DokkaContext) : SuppressedByConditionDocumentableFilterTransformer(context) {\n\n    override fun shouldBeSuppressed(d: Documentable): Boolean {\n        val annotations: List<Annotations.Annotation> =\n            (d as? WithExtraProperties<*>)\n                ?.extra\n                ?.allOfType<Annotations>()\n                ?.flatMap { it.directAnnotations.values.flatten() }\n                ?: emptyList()\n\n        return annotations.any { isInternalAnnotation(it) }\n    }\n\n    private fun isInternalAnnotation(annotation: Annotations.Annotation): Boolean {\n        return annotation.dri.packageName == \"org.jetbrains.dokka.internal.test\"\n                && annotation.dri.classNames == \"Internal\"\n    }\n}\n

Bump the plugin version in build.gradle.kts, publish it to Maven Local, open the debug project and run dokkaHtml (without debug this time). It should work: the shouldBeExcludedFromDocumentation function should no longer appear in the generated documentation.

Manual testing is cool and all, but wouldn't it be better if we could somehow write unit tests for it? Indeed!

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#unit-testing","title":"Unit testing","text":"

You might've noticed that the plugin template comes with a pre-made test class. Feel free to move it to another package and rename it.

We are mostly interested in a single test case - functions annotated with @Internal should be hidden, while all other public functions should be visible.

Plugin API comes with a set of convenient test utilities that are used to test Dokka itself, so it covers a wide range of use cases. When in doubt, see Dokka's tests for reference.

Below you will find a complete unit test that passes, followed by the main takeaways.

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.base.testApi.testRunner.BaseAbstractTest\nimport kotlin.test.Test\nimport kotlin.test.assertEquals\n\nclass HideInternalApiPluginTest : BaseAbstractTest() {\n\n    @Test\n    fun `should hide annotated functions`() {\n        val configuration = dokkaConfiguration {\n            sourceSets {\n                sourceSet {\n                    sourceRoots = listOf(\"src/main/kotlin/basic/Test.kt\")\n                }\n            }\n        }\n        val hideInternalPlugin = HideInternalApiPlugin()\n\n        testInline(\n            \"\"\"\n            |/src/main/kotlin/basic/Test.kt\n            |package org.jetbrains.dokka.internal.test\n            |\n            |annotation class Internal\n            |\n            |fun shouldBeVisible() {}\n            |\n            |@Internal\n            |fun shouldBeExcludedFromDocumentation() {}\n        \"\"\".trimMargin(),\n            configuration = configuration,\n            pluginOverrides = listOf(hideInternalPlugin)\n        ) {\n            preMergeDocumentablesTransformationStage = { modules ->\n                val testModule = modules.single { it.name == \"root\" }\n                val testPackage = testModule.packages.single { it.name == \"org.jetbrains.dokka.internal.test\" }\n\n                val packageFunctions = testPackage.functions\n                assertEquals(1, packageFunctions.size)\n                assertEquals(\"shouldBeVisible\", packageFunctions[0].name)\n            }\n        }\n    }\n}\n

Note that the package of the tested code (inside testInline function) is the same as the package that we have hardcoded in our plugin. Make sure to change that to your own if you are following along, otherwise it will fail.

Things to note and remember:

  1. Your test class should extend BaseAbstractTest, which contains base utility methods for testing.
  2. You can configure Dokka to your liking, enable some specific settings, configure source sets, etc. All done via dokkaConfiguration DSL.
  3. The testInline function is the main entry point for unit tests.
  4. You can pass plugins to be used in a test; notice the pluginOverrides parameter.
  5. You can write asserts for different stages of generating documentation, the main ones being Documentables model generation, Pages generation and Output generation. Since we implemented our plugin to work during PreMergeDocumentableTransformer stage, we can test it on the same level (that is preMergeDocumentablesTransformationStage).
  6. You will need to write asserts using the model of whatever stage you choose. For Documentable transformation stage it's Documentable, for Page generation stage you would have Page model, and for Output you can have .html files that you will need to parse with JSoup (there are also utilities for that).

Full source code of this tutorial can be found in Dokka's examples under hide-internal-api.

"}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Dokka","text":"

Dokka is an API documentation engine for Kotlin.

If you want to learn how to use Dokka, see documentation on kotlinlang.org.

If you want to learn more about Dokka's internals and/or how to write Dokka plugins, see Developer guides.

"},{"location":"developer_guide/introduction/","title":"Developer guides","text":"

The purpose of the Developer guides documentation is to get you acquainted with Dokka's internals so that you can start developing your own plugins or contributing features and fixes to Dokka itself.

If you want to start hacking on Dokka right away, the only thing you need to be aware of is the general workflow: it will teach you how to build, debug and test Dokka locally.

CONTRIBUTING.md contains information that can be useful if you want to contribute to Dokka.

If you want to get into plugin development quick, see Introduction to plugin development.

If you have time to spare and want to know more about Dokka's internals, its architecture and capabilities, follow Architecture overview and subsequent sections inside Internals.

Having read through all the developer guides, you'll have a pretty good understanding of Dokka and how to develop for it.

If you have any questions, feel free to get in touch with maintainers via Slack or GitHub.

"},{"location":"developer_guide/workflow/","title":"Workflow","text":"

Whether you're contributing a feature/fix to Dokka itself or developing a Dokka plugin, there are 3 essential things you need to know how to do:

  1. How to build Dokka or a plugin
  2. How to use/test locally built Dokka in a project
  3. How to debug Dokka or a plugin in IntelliJ IDEA

We'll go over each step individually in this section.

Examples below will be specific to Gradle and Gradle\u2019s Kotlin DSL, but you can apply the same principles and run/test/debug with CLI/Maven runners and build configurations if you wish.

"},{"location":"developer_guide/workflow/#build-dokka","title":"Build Dokka","text":"

Building Dokka is pretty straightforward:

./gradlew build\n

This will build all subprojects and trigger build in all composite builds. If you are working on a runner, you can build it independently.

Checks that are performed as part of the build do not require any special configuration or environment and should not take much time (~2-5 minutes), so please make sure they pass before submitting a pull request.

"},{"location":"developer_guide/workflow/#troubleshooting-build","title":"Troubleshooting build","text":""},{"location":"developer_guide/workflow/#api-check-failed-for-project","title":"API check failed for project ..","text":"

If you see a message like API check failed for project .. during the build phase, it indicates that the binary compatibility check has failed, meaning you've changed/added/removed some public API.

If the change was intentional, run ./gradlew apiDump - it will re-generate the .api files with signatures, and you should be able to build Dokka with no errors. These updated files need to be committed as well. Maintainers will review API changes thoroughly, so please make sure the change is intentional and justified.

"},{"location":"developer_guide/workflow/#use-test-locally-built-dokka","title":"Use / test locally built Dokka","text":"

Having built Dokka locally, you can publish it to mavenLocal(). This will allow you to test your changes in another project as well as debug code remotely.

  1. Publish a custom version of Dokka to Maven Local: ./gradlew publishToMavenLocal -Pversion=1.9.20-my-fix-SNAPSHOT. This version will be propagated to plugins that reside inside Dokka's project (mathjax, kotlin-as-java, etc), and its artifacts should appear in ~/.m2
  2. In the project you want to generate documentation for or debug on, add maven local as a plugin/dependency repository:
    repositories {\n   mavenLocal()\n}\n
  3. Update your Dokka dependency to the version you've just published:
    plugins {\n    id(\"org.jetbrains.dokka\") version \"1.9.20-my-fix-SNAPSHOT\"\n}\n

After completing these steps, you should be able to build documentation using your own version of Dokka.

"},{"location":"developer_guide/workflow/#debugging-dokka","title":"Debugging Dokka","text":"

Dokka is essentially a Gradle plugin, so you can debug it the same way you would any other Gradle plugin.

Below you'll find instructions on how to debug Dokka's internal logic, but you can apply the same principles if you wish to debug a Dokka plugin.

  1. Choose a project to debug on; it needs to have some code for which documentation will be generated. Prefer smaller projects that reproduce the exact problem or behaviour, since the less code there is, the easier it will be to understand what's going on. You can use the example projects found in dokka/examples/gradle; there are both simple single-module and more complex multi-module / multiplatform examples.
  2. For the debug project, set org.gradle.debug to true in one of the following ways:

    • In your gradle.properties add org.gradle.debug=true
    • When running Dokka tasks: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon
  3. Run the desired Dokka task with --no-daemon. Gradle should wait until you attach the debugger before proceeding with the task, so there's no need to hurry. Example: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon.

  4. Open Dokka in IntelliJ IDEA, set a breakpoint and, using remote debug, attach to the process running on the default port 5005. You can do that either by creating a Remote JVM Debug Run/Debug configuration or by attaching to the process via Run -> Attach to process.

Note

The reason for --no-daemon is that Gradle daemons continue to exist even after the task has completed, so you might hang in debug or run into port already in use issues if you try to run it again.

If you previously ran Dokka with daemons and are already encountering problems, try killing the Gradle daemons, for instance via pkill -f gradle.*daemon.

In case you need to debug some other part of the build, consult the official Gradle tutorials on Troubleshooting Builds.

"},{"location":"developer_guide/workflow/#run-integration-tests","title":"Run integration tests","text":"

Dokka's integration tests help check compatibility with various versions of Kotlin, Android, Gradle and Java. They apply Dokka to real user-like projects and invoke Gradle / Maven / CLI tasks to generate the documentation.

Integration tests require a significant amount of available RAM (~20-30GB), take 1+ hour and may require additional environment configuration to run. For these reasons, you are not expected to run all integration tests locally as part of the everyday development process; they will be run on CI once you submit a PR.

However, if you need to run all integration tests locally, you can use the integrationTest task:

./gradlew integrationTest\n

If you need to run a specific test locally, you can run it from your IDE or by calling the corresponding Gradle task (for example, :dokka-integration-tests:gradle:testExternalProjectKotlinxCoroutines).

"},{"location":"developer_guide/architecture/architecture_overview/","title":"Architecture overview","text":"

Normally, you would think that a tool like Dokka simply parses some programming language sources and generates HTML pages for whatever it sees along the way, with little to no abstractions. That would be the simplest and the most straightforward way to implement an API documentation engine.

However, it was clear that Dokka may need to generate documentation from various sources (not only Kotlin), that users might request additional output formats (like Markdown), that users might need additional features like supporting custom KDoc tags or rendering mermaid.js diagrams - all these things would require changing a lot of code inside Dokka itself if all solutions were hardcoded.

For this reason, Dokka was built from the ground up to be easily extensible and customizable by adding several layers of abstractions to the data model, and by providing pluggable extension points, giving you the ability to introduce selective changes on a given level.

"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-data-model","title":"Overview of data model","text":"

Generating API documentation begins with input source files (.kt, .java, etc) and ends with some output files (.html/.md, etc). However, to allow for extensibility and customization, several input and output independent abstractions have been added to the data model.

Below you can find the general pipeline of processing data gathered from sources and the explanation for each stage.

flowchart TD\n    Input --> Documentables --> Pages --> Output

You, as a Dokka developer or a plugin writer, can use extension points to introduce selective changes to the model on one particular level without altering everything else.

For instance, if you wanted to make an annotation / function / class invisible in the final documentation, you would only need to modify the Documentables level by filtering undesirable declarations out. If you wanted to display all overloaded methods on the same page instead of on separate ones, you would only need to modify the Pages layer by merging multiple pages into one, and so on.

For a deeper dive into Dokka's model with more examples and details, see sections about Documentables and Page/Content

For an overview of existing extension points that let you transform Dokka's models, see Core extension points and Base extensions.

"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-extension-points","title":"Overview of extension points","text":"

An extension point usually represents a pluggable interface that performs an action during one of the stages of generating documentation. An extension is, therefore, an implementation of the interface which is extending the extension point.

You can create extension points, provide your own implementations (extensions) and configure them. All of this is possible with Dokka's plugin / extension point API.

Here's a sneak peek of the DSL:

// declare your own plugin\nclass MyPlugin : DokkaPlugin() {\n    // create an extension point for developers to use\n    val signatureProvider by extensionPoint<SignatureProvider>()\n\n    // provide a default implementation\n    val defaultSignatureProvider by extending {\n        signatureProvider with KotlinSignatureProvider()\n    }\n\n    // register our own extension in Dokka's Base plugin by overriding its default implementation\n    val dokkaBasePlugin by lazy { plugin<DokkaBase>() }\n    val multimoduleLocationProvider by extending {\n        (dokkaBasePlugin.locationProviderFactory\n                providing MultimoduleLocationProvider::Factory\n                override dokkaBasePlugin.locationProvider)\n    }\n}\n\nclass MyExtension(val context: DokkaContext) {\n\n    // use an existing extension\n    val signatureProvider: SignatureProvider = context.plugin<MyPlugin>().querySingle { signatureProvider }\n\n    fun doSomething() {\n        signatureProvider.signature(..)\n    }\n}\n\ninterface SignatureProvider {\n    fun signature(documentable: Documentable): List<ContentNode>\n}\n\nclass KotlinSignatureProvider : SignatureProvider {\n    override fun signature(documentable: Documentable): List<ContentNode> = listOf()\n}\n

For a deeper dive into extensions and extension points, see Introduction to Extensions.

For an overview of existing extension points, see Core extension points and Base extensions.

"},{"location":"developer_guide/architecture/architecture_overview/#historical-context","title":"Historical context","text":"

This is the second iteration of Dokka, built from scratch.

If you want to learn more about why Dokka was redesigned this way, watch this great talk by Pawe\u0142 Marks: New Dokka - Designed for Fearless Creativity. The general principles and architecture are the same, although the talk may be outdated in some areas, so please double-check.

"},{"location":"developer_guide/architecture/data_model/documentable_model/","title":"Documentable Model","text":"

The Documentable model represents the data that is parsed from some programming language sources. Think of this data as something that could be seen or produced by a compiler frontend; that's not far from the truth.

By default, the documentables are created from:

Code-wise, you can have a look at the following classes:

Upon creation, the documentable model represents a collection of trees, each with DModule as root.

Take some arbitrary Kotlin source code that is located within the same module:

// Package 1\nclass Clazz(val property: String) {\n    fun function(parameter: String) {}\n}\n\nfun topLevelFunction() {}\n\n// Package 2\nenum class Enum { }\n\nval topLevelProperty: String\n

This would be represented roughly as the following Documentable tree:

flowchart TD\n    DModule --> firstPackage[DPackage]\n    firstPackage --> DClass\n    firstPackage --> toplevelfunction[DFunction] \n    DClass --> DProperty\n    DClass --> DFunction\n    DFunction --> DParameter\n    DModule --> secondPackage[DPackage]\n    secondPackage --> DEnum\n    secondPackage --> secondPackageProperty[DProperty]

At later stages of transformation, all trees are folded into one by DocumentableMerger.

"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentable","title":"Documentable","text":"

The main building block of the documentable model is the Documentable class. It is the base class for all more specific types. All implementations represent elements of source code with mostly self-explanatory names: DFunction, DPackage, DProperty, and so on.

DClasslike is the base class for all class-like documentables, such as DClass, DEnum, DAnnotation and others.

The contents of each documentable normally represent what you would see in the source code.

For example, if you open DClass, you should find that it contains references to functions, properties, companion objects, constructors and so on. DEnum should have references to its entries, and DPackage can have references to both classlikes and top-level functions and properties (Kotlin-specific).

Here's an example of a documentable:

data class DClass(\n    val dri: DRI,\n    val name: String,\n    val constructors: List<DFunction>,\n    val functions: List<DFunction>,\n    val properties: List<DProperty>,\n    val classlikes: List<DClasslike>,\n    val sources: SourceSetDependent<DocumentableSource>,\n    val visibility: SourceSetDependent<Visibility>,\n    val companion: DObject?,\n    val generics: List<DTypeParameter>,\n    val supertypes: SourceSetDependent<List<TypeConstructorWithKind>>,\n    val documentation: SourceSetDependent<DocumentationNode>,\n    val expectPresentInSet: DokkaSourceSet?,\n    val modifier: SourceSetDependent<Modifier>,\n    val sourceSets: Set<DokkaSourceSet>,\n    val isExpectActual: Boolean,\n    val extra: PropertyContainer<DClass> = PropertyContainer.empty()\n) : DClasslike(), WithAbstraction, WithCompanion, WithConstructors,\n    WithGenerics, WithSupertypes, WithExtraProperties<DClass>\n

There are three non-documentable classes that are important for this model:

"},{"location":"developer_guide/architecture/data_model/documentable_model/#dri","title":"DRI","text":"

DRI stands for Dokka Resource Identifier - a unique value that identifies a specific Documentable. All references and relations between the documentables (other than direct ownership) are described using DRIs.

For example, DFunction with a parameter of type Foo only has Foo's DRI, but not the actual reference to Foo's Documentable object.

"},{"location":"developer_guide/architecture/data_model/documentable_model/#example","title":"Example","text":"

For an example of what a DRI can look like, let's take the limitedParallelism function from kotlinx.coroutines:

package kotlinx.coroutines\n\nimport ...\n\npublic abstract class MainCoroutineDispatcher : CoroutineDispatcher() {\n\n    override fun limitedParallelism(parallelism: Int): CoroutineDispatcher {\n        ...\n    }\n}\n

If we were to re-create the DRI of this function in code, it would look something like this:

DRI(\n    packageName = \"kotlinx.coroutines\",\n    classNames = \"MainCoroutineDispatcher\",\n    callable = Callable(\n        name = \"limitedParallelism\",\n        receiver = null,\n        params = listOf(\n            TypeConstructor(\n                fullyQualifiedName = \"kotlin.Int\",\n                params = emptyList()\n            )\n        )\n    ),\n    target = PointingToDeclaration,\n    extra = null\n)\n

If you format it as a String, it would look like this:

kotlinx.coroutines/MainCoroutineDispatcher/limitedParallelism/#kotlin.Int/PointingToDeclaration/\n
"},{"location":"developer_guide/architecture/data_model/documentable_model/#sourcesetdependent","title":"SourceSetDependent","text":"

SourceSetDependent helps handle multiplatform data by associating platform-specific data (declared with either the expect or actual modifier) with particular source sets.

This comes in handy if the expect / actual declarations differ. For example, the default value for actual might differ from that declared in expect, or code comments written for expect might be different from what's written for actual.

Under the hood, it's a typealias to a Map:

typealias SourceSetDependent<T> = Map<DokkaSourceSet, T>\n
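
Since it is a plain Map, per-source-set data can be read with regular collection operations. Below is a minimal sketch that prints how many documentation sections each source set contributes, assuming the documentation property of DFunction (shown later on this page) and the displayName property of DokkaSourceSet:

// A minimal sketch: SourceSetDependent<T> is just a Map<DokkaSourceSet, T>,\n// so per-source-set data can be read with regular Map operations.\nfun printDocumentationPerSourceSet(function: DFunction) {\n    function.documentation.forEach { (sourceSet, docs) ->\n        println(\"${sourceSet.displayName}: ${docs.children.size} documentation sections\")\n    }\n}\n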
"},{"location":"developer_guide/architecture/data_model/documentable_model/#extraproperty","title":"ExtraProperty","text":"

ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins.

This element is a bit more complex, so you can read more about how to use it in a separate section.

"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentation-model","title":"Documentation model","text":"

The Documentation model is used alongside documentables to store data obtained by parsing code comments (such as KDocs / Javadocs).

"},{"location":"developer_guide/architecture/data_model/documentable_model/#doctag","title":"DocTag","text":"

DocTag describes a specific documentation syntax element.

It's universal across language sources. For example, the DocTag B is the same for **bold** in Kotlin and <b>bold</b> in Java.

However, some DocTag elements are specific to one language. There are many such examples for Java, because it allows HTML tags inside Javadoc comments, some of which are simply not possible to reproduce with the Markdown syntax that KDocs use.

DocTag elements can be deeply nested with other DocTag children elements.

Examples:

data class H1(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class H2(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class Strikethrough(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class Strong(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : DocTag()\n\ndata class CodeBlock(\n    override val children: List<DocTag> = emptyList(),\n    override val params: Map<String, String> = emptyMap()\n) : Code()\n
"},{"location":"developer_guide/architecture/data_model/documentable_model/#tagwrapper","title":"TagWrapper","text":"

TagWrapper describes the whole comment description or a specific comment tag. For example: @see / @author / @return.

Since each such section may contain formatted text inside it, each TagWrapper has DocTag children.

/**\n * @author **Ben Affleck**\n * @return nothing, except _sometimes_ it may throw an [Error]\n */\nfun foo() {}\n
"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentationnode","title":"DocumentationNode","text":"

DocumentationNode acts as a container for multiple TagWrapper elements for a specific Documentable, usually used like this:

data class DFunction(\n    ...\n    val documentation: SourceSetDependent<DocumentationNode>,\n    ...\n)\n
"},{"location":"developer_guide/architecture/data_model/extra/","title":"Extra","text":""},{"location":"developer_guide/architecture/data_model/extra/#introduction","title":"Introduction","text":"

ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins.

ExtraProperty classes are available both in the Documentable and the Content models.

To create a new extra, you need to implement the ExtraProperty interface. It is advised to use the following pattern when declaring new extras:

data class CustomExtra(\n    [any data relevant to your extra], \n    [any data relevant to your extra] \n): ExtraProperty<Documentable> {\n    override val key: ExtraProperty.Key<Documentable, *> = CustomExtra\n    companion object : ExtraProperty.Key<Documentable, CustomExtra>\n}\n

The merge strategy for extras (the mergeStrategyFor method) is invoked while merging documentables from different source sets, whenever the documentables being merged each have their own Extra of the same type.
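
As a rough sketch of that mechanism, the hypothetical extra below resolves conflicts by keeping the left-hand value. The MergeStrategy API is approximated from memory, so double-check it against the dokka-core sources before relying on it:

// Hypothetical extra with a custom merge strategy (MergeStrategy API approximated from memory)\ndata class SinceVersion(val version: String) : ExtraProperty<Documentable> {\n    companion object : ExtraProperty.Key<Documentable, SinceVersion> {\n        // resolve conflicts by keeping the left-hand value\n        override fun mergeStrategyFor(left: SinceVersion, right: SinceVersion): MergeStrategy<Documentable> =\n            MergeStrategy.Replace(left)\n    }\n    override val key: ExtraProperty.Key<Documentable, *> = SinceVersion\n}\n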

"},{"location":"developer_guide/architecture/data_model/extra/#propertycontainer","title":"PropertyContainer","text":"

All extras for ContentNode and Documentable classes are stored in the PropertyContainer<C : Any> class instances.

data class DFunction(\n    ...\n    override val extra: PropertyContainer<DFunction> = PropertyContainer.empty()\n    ...\n) : WithExtraProperties<DFunction>\n

PropertyContainer has a number of convenient functions for handling extras in a collection-like manner.

The generic class parameter C limits the types of properties that can be stored in the container - it must match the generic C class parameter from the ExtraProperty interface. This allows creating extra properties which can only be stored in a specific Documentable.

"},{"location":"developer_guide/architecture/data_model/extra/#usage-example","title":"Usage example","text":"

In the following example, we will create a DFunction-only extra property, store it and then retrieve its value:

// Extra that is applicable only to DFunction\ndata class CustomExtra(val customExtraValue: String) : ExtraProperty<DFunction> {\n    override val key: ExtraProperty.Key<Documentable, *> = CustomExtra\n    companion object: ExtraProperty.Key<Documentable, CustomExtra>\n}\n\n// Storing it inside the documentable\nfun DFunction.withCustomExtraProperty(data: String): DFunction {\n    return this.copy(\n        extra = extra + CustomExtra(data)\n    )\n}\n\n// Retrieving it from the documentable\nfun DFunction.getCustomExtraPropertyValue(): String? {\n    return this.extra[CustomExtra]?.customExtraValue\n}\n

You can also use extras as markers, without storing any data in them:

object MarkerExtra : ExtraProperty<Any>, ExtraProperty.Key<Any, MarkerExtra> {\n    override val key: ExtraProperty.Key<Any, *> = this\n}\n\nfun Documentable.markIfFunction(): Documentable {\n    return when(this) {\n        is DFunction -> this.copy(extra = extra + MarkerExtra)\n        else -> this\n    }\n}\n\nfun WithExtraProperties<Documentable>.isMarked(): Boolean {\n    return this.extra[MarkerExtra] != null\n}\n
"},{"location":"developer_guide/architecture/data_model/page_content/","title":"Page / Content Model","text":"

Even though the Page and Content models reside on the same level (the Pages level), it is easier to view them as two different models altogether, with Content used only in conjunction with, and inside of, the Page model.

"},{"location":"developer_guide/architecture/data_model/page_content/#page","title":"Page","text":"

The Page model represents the structure of documentation pages to be generated. During rendering, each page is processed separately, so one page corresponds to exactly one output file.

The Page model is independent of the final output format. In other words, it's universal. Which file extension the pages should be created as (.html, .md, etc), and how, is up to the Renderer extension.

Subclasses of the PageNode class represent the different kinds of pages, such as ModulePage, PackagePage, ClasslikePage, MemberPage and so on.

The Page model can be represented as a tree, with RootPageNode at the root.

Here's an example of what an arbitrary project's Page tree might look like, given a project that consists of a module with 3 packages, one of which contains a top-level function, a top-level property and a class, inside which there's a function and a property:

flowchart TD\n    RootPageNode --> firstPackage[PackagePageNode]\n    RootPageNode --> secondPackage[PackagePageNode]\n    RootPageNode --> thirdPackage[PackagePageNode]\n    firstPackage --> firstPackageFirstMember[MemberPageNode - Function]\n    firstPackage --> firstPackageSecondMember[MemberPageNode - Property]\n    firstPackage ---> firstPackageClasslike[ClasslikePageNode - Class]\n    firstPackageClasslike --> firstPackageClasslikeFirstMember[MemberPageNode - Function]\n    firstPackageClasslike --> firstPackageClasslikeSecondMember[MemberPageNode - Property]\n    secondPackage --> etcOne[...]\n    thirdPackage --> etcTwo[...]

Almost all pages are derivatives of ContentPage - it's the type of a page that has user-visible content on it.

"},{"location":"developer_guide/architecture/data_model/page_content/#content-model","title":"Content Model","text":"

The Content model describes what the pages consist of. It is essentially a set of building blocks that you can put together to represent some content. It is also output-format independent and universal.

For an example, have a look at the subclasses of ContentNode: ContentText, ContentList, ContentTable, ContentCodeBlock, ContentHeader and so on -- all self-explanatory. You can group chunks of content together with ContentGroup - for example, to wrap all children with a style.

// real example of composing content using the `DocumentableContentBuilder` DSL\norderedList {\n    item {\n        text(\"This list contains a nested table:\")\n        table {\n            header {\n                text(\"Col1\")\n                text(\"Col2\")\n            }\n            row {\n                text(\"Text1\")\n                text(\"Text2\")\n            }\n        }\n    }\n    item {\n        group(styles = setOf(TextStyle.Bold)) {\n            text(\"This is bold\")\n            text(\"This is also bold\")\n        }\n    }\n}\n

It is the responsibility of the Renderer (i.e. a specific output format) to render it in a way the user can process, be it visually (HTML pages) or otherwise (JSON).

For instance, HtmlRenderer might render ContentCodeBlock as <code>text</code>, but CommonmarkRenderer might render it using backticks.

"},{"location":"developer_guide/architecture/data_model/page_content/#dci","title":"DCI","text":"

Each node is identified by a unique DCI, which stands for Dokka Content Identifier.

DCI aggregates DRIs of all documentables that are used by the given ContentNode.

data class DCI(val dri: Set<DRI>, val kind: Kind)\n

All references to other nodes (other than direct ownership) are described using DCI.

"},{"location":"developer_guide/architecture/data_model/page_content/#contentkind","title":"ContentKind","text":"

ContentKind represents a grouping of content of one kind that can be rendered as part of a composite page, like a single tab or block within a class's page.

For example, on the same page that describes a class you can have multiple sections (== ContentKinds). One to describe functions, one to describe properties, another one to describe the constructors, and so on.
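
For illustration, here's an abridged sketch of what the ContentKind enum looks like; the names are reproduced from memory and may differ between Dokka versions:

// abridged; reproduced from memory, the exact set of values may differ\nenum class ContentKind : Kind {\n    Main, Functions, Properties, Constructors, Classlikes,\n    Packages, Comment, Source, Sample, ...\n}\n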

"},{"location":"developer_guide/architecture/data_model/page_content/#styles","title":"Styles","text":"

Each ContentNode has a styles property in case you want to indicate to the Renderer that this content needs to be rendered in a certain way.

group(styles = setOf(TextStyle.Paragraph)) {\n    text(\"Text1\", styles = setOf(TextStyle.Bold))\n    text(\"Text2\", styles = setOf(TextStyle.Italic))\n}\n

It is the responsibility of the Renderer (i.e. a specific output format) to render it in a way the user can process. For instance, HtmlRenderer might render TextStyle.Bold as <b>text</b>, whereas CommonmarkRenderer might render it as **text**.

There are a number of existing styles that you can use; most of them are supported by the HtmlRenderer extension out of the box:

// for code highlighting\nenum class TokenStyle : Style {\n    Keyword, Punctuation, Function, Operator, Annotation,\n    Number, String, Boolean, Constant, Builtin, ...\n}\n\nenum class TextStyle : Style {\n    Bold, Italic, Strong, Strikethrough, Paragraph, ...\n}\n\nenum class ContentStyle : Style {\n    TabbedContent, RunnableSample, Wrapped, Indented, ...\n}\n
"},{"location":"developer_guide/architecture/data_model/page_content/#extra","title":"Extra","text":"

ExtraProperty is used to store any additional information that falls outside of the regular model.

It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins.

All ExtraProperty elements from the Documentable model are propagated into the Content model, and are available in the Renderer extensions.

This element is a bit complex, so you can read more about how to use it in a separate section.

"},{"location":"developer_guide/architecture/extension_points/base_plugin/","title":"Base plugin","text":"

DokkaBase represents Dokka's Base plugin, which provides a number of sensible default implementations for CoreExtensions, as well as declares its own, more high-level abstractions and extension points to be used from other plugins and output formats.

If you want to develop a simple plugin that only changes a few details, it is very convenient to rely on default implementations and use extension points defined in DokkaBase, as it reduces the scope of changes you need to make.

DokkaBase is used extensively in Dokka's own output formats.

You can learn how to add, use, override and configure extensions and extension points in Introduction to Extensions - all of that information is applicable to the DokkaBase plugin as well.

"},{"location":"developer_guide/architecture/extension_points/base_plugin/#extension-points","title":"Extension points","text":"

Below are some notable extension points defined in Dokka's Base plugin.

"},{"location":"developer_guide/architecture/extension_points/base_plugin/#premergedocumentabletransformer","title":"PreMergeDocumentableTransformer","text":"

PreMergeDocumentableTransformer is very similar to the DocumentableTransformer core extension point, but it is used during an earlier stage by the Single module generation.

This extension point allows you to apply any transformations to the Documentables model before the project's source sets are merged.

It is useful if you want to filter/map existing documentables. For example, if you want to exclude members annotated with @Internal, you most likely need an implementation of PreMergeDocumentableTransformer.

For simple condition-based filtering of documentables, consider extending SuppressedByConditionDocumentableFilterTransformer - it implements PreMergeDocumentableTransformer and only requires one function to be overridden, whereas the rest is taken care of.
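
A minimal sketch of such a transformer is shown below; see the sample plugin tutorial for a complete, working example:

class MyFilterTransformer(context: DokkaContext) : SuppressedByConditionDocumentableFilterTransformer(context) {\n    override fun shouldBeSuppressed(d: Documentable): Boolean {\n        // return true for every documentable that should be removed from the documentation\n        return false\n    }\n}\n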

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/","title":"Core extension points","text":"

Core extension points represent the main stages of generating documentation.

These extension points are plugin and output format independent, meaning they represent the very core functionality and are as low-level as it gets in Dokka.

For higher-level extension functions that can be used in different output formats, have a look at the Base plugin.

You can find all core extensions in the CoreExtensions class:

object CoreExtensions {\n    val preGenerationCheck by coreExtensionPoint<PreGenerationChecker>()\n    val generation by coreExtensionPoint<Generation>()\n    val sourceToDocumentableTranslator by coreExtensionPoint<SourceToDocumentableTranslator>()\n    val documentableMerger by coreExtensionPoint<DocumentableMerger>()\n    val documentableTransformer by coreExtensionPoint<DocumentableTransformer>()\n    val documentableToPageTranslator by coreExtensionPoint<DocumentableToPageTranslator>()\n    val pageTransformer by coreExtensionPoint<PageTransformer>()\n    val renderer by coreExtensionPoint<Renderer>()\n    val postActions by coreExtensionPoint<PostAction>()\n}\n

On this page, we'll go over each extension point individually.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pregenerationchecker","title":"PreGenerationChecker","text":"

PreGenerationChecker can be used to run some checks and constraints.

For example, Dokka's Javadoc plugin does not support generating documentation for multi-platform projects, so it uses PreGenerationChecker to check for multi-platform source sets, and fails if it finds any.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#generation","title":"Generation","text":"

Generation is responsible for generating documentation as a whole, utilizing higher-level extensions and extension points where applicable.

See Generation implementations to learn about the default implementations.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#sourcetodocumentabletranslator","title":"SourceToDocumentableTranslator","text":"

SourceToDocumentableTranslator translates any given sources into the Documentable model.

Kotlin and Java sources are supported by default by the Base plugin, but you can analyze any language as long as you can map it to the Documentable model.

For reference, see
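
The contract roughly looks like this; the exact signature is approximated from memory, so check the dokka-core sources for the authoritative definition:

// approximate shape of the extension point's interface\ninterface SourceToDocumentableTranslator {\n    fun invoke(sourceSet: DokkaConfiguration.DokkaSourceSet, context: DokkaContext): DModule\n}\n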

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentablemerger","title":"DocumentableMerger","text":"

DocumentableMerger merges all DModule instances into one. Only one extension of this type is expected to be registered.
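
Its contract is roughly the following (approximated from memory; the exact signature may differ between Dokka versions):

// approximate shape: merge all module trees into a single one\ninterface DocumentableMerger {\n    operator fun invoke(modules: Collection<DModule>): DModule?\n}\n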

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletransformer","title":"DocumentableTransformer","text":"

DocumentableTransformer performs the same function as PreMergeDocumentableTransformer, but after merging source sets.

A notable example is InheritorsExtractorTransformer, which extracts inheritance information from source sets and creates an inheritance map.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletopagetranslator","title":"DocumentableToPageTranslator","text":"

DocumentableToPageTranslator is responsible for creating pages and their content. See Page / Content model page for more information and examples.

Output formats can either use the same page structure or define their own.

Only a single extension of this type is expected to be registered.
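
Roughly, the extension point looks like this (signature approximated from memory):

// approximate shape: turn the merged Documentable tree into a Page tree\ninterface DocumentableToPageTranslator {\n    operator fun invoke(module: DModule): RootPageNode\n}\n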

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pagetransformer","title":"PageTransformer","text":"

PageTransformer is useful if you need to add, remove or modify generated pages or their content.

Using this extension point, plugins like org.jetbrains.dokka:mathjax-plugin can add .js scripts to the HTML pages.

If you want all overloaded functions to be rendered on the same page instead of separate ones, you can use PageTransformer to combine the pages into a single one.
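
A minimal sketch of such a transformer is shown below; the invoke signature is approximated from memory, and a real implementation would return a modified copy of the page tree rather than the input itself:

class MyPageTransformer : PageTransformer {\n    override fun invoke(input: RootPageNode): RootPageNode {\n        // inspect or rearrange the page tree here; returning it unchanged is a no-op\n        return input\n    }\n}\n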

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#renderer","title":"Renderer","text":"

The Renderer defines the rules for rendering pages and their content: which files to create and how to display the content properly.

Custom output format plugins should use the Renderer extension point. Notable examples are HtmlRenderer and CommonmarkRenderer.

"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#postaction","title":"PostAction","text":"

PostAction can be used when you want to run some actions after the documentation has been generated - for example, if you want to move some files around or log some informational messages.

Dokka's Versioning plugin utilizes PostAction to move generated documentation to the versioned directories.
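
A minimal sketch of a PostAction that only logs a message once generation is finished might look like this (the exact PostAction signature is approximated from memory):

class LogCompletionPostAction(private val context: DokkaContext) : PostAction {\n    override fun invoke() {\n        context.logger.info(\"Documentation generation finished\")\n    }\n}\n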

"},{"location":"developer_guide/architecture/extension_points/extension_points/","title":"Extension points","text":"

In this section you can learn how to create new extension points, how to configure existing ones, and how to query for registered extensions when generating documentation.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#declaring-extension-points","title":"Declaring extension points","text":"

If you are writing a plugin, you can create your own extension points that other developers (or you) can use in other plugins / parts of code.

class MyPlugin : DokkaPlugin() {\n    val sampleExtensionPoint by extensionPoint<SampleExtensionPointInterface>()\n}\n\ninterface SampleExtensionPointInterface {\n    fun doSomething(input: Input): List<Output>\n}\n\nclass Input\nclass Output\n

Usually, you would want to provide some default implementations for your extension points. You can do that within the same plugin class by extending an extension point you've just created. See Extending from extension points for examples.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#extending-from-extension-points","title":"Extending from extension points","text":"

You can use extension points to provide your own implementations in order to customize a plugin's behaviour.

If you want to provide an implementation for an extension point declared in an external plugin (including DokkaBase), you can use plugin querying API to do that.

The example below shows how to extend MyPlugin (that was created above) with an implementation of SampleExtensionPointInterface.

class MyExtendedPlugin : DokkaPlugin() {\n\n    val mySampleExtensionImplementation by extending {\n        plugin<MyPlugin>().sampleExtensionPoint with SampleExtensionImpl()\n    }\n}\n\nclass SampleExtensionImpl : SampleExtensionPointInterface {\n    override fun doSomething(input: Input): List<Output> = listOf()\n}\n

Alternatively, if it is your own plugin, you can do that within the same class as the extension point itself:

open class MyPlugin : DokkaPlugin() {\n    val sampleExtensionPoint by extensionPoint<SampleExtensionPointInterface>()\n\n    val defaultSampleExtension by extending {\n        sampleExtensionPoint with DefaultSampleExtension()\n    }\n}\n\nclass DefaultSampleExtension : SampleExtensionPointInterface {\n    override fun doSomething(input: Input): List<Output> = listOf()\n}\n
"},{"location":"developer_guide/architecture/extension_points/extension_points/#providing","title":"Providing","text":"

If you need to have access to DokkaContext when creating an extension, you can use the providing keyword instead.

val defaultSampleExtension by extending {\n    sampleExtensionPoint providing { context ->\n        // can use context to query other extensions or get configuration \n        DefaultSampleExtension() \n    }\n}\n

You can read more on what you can do with context in Obtaining extension instance.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#override","title":"Override","text":"

By extending an extension point, you are registering an additional extension. This behaviour is expected by some extension points, for example the Documentable transformers, because all registered transformer extensions do their own transformations independently and one after the other.

However, a plugin can expect only a single extension to be registered for an extension point. In this case, you can use the override keyword to override the existing registered extension:

class MyExtendedPlugin : DokkaPlugin() {\n    private val myPlugin by lazy { plugin<MyPlugin>() }\n\n    val mySampleExtensionImplementation by extending {\n        (myPlugin.sampleExtensionPoint\n                with SampleExtensionImpl()\n                override myPlugin.defaultSampleExtension)\n    }\n}\n

This is also useful if you wish to override some extension from DokkaBase, to disable or alter it.

"},{"location":"developer_guide/architecture/extension_points/extension_points/#order","title":"Order","text":"

Sometimes, the order in which extensions are invoked matters. This is something you can control as well using the order construct:

class MyExtendedPlugin : DokkaPlugin() {\n    private val myPlugin by lazy { plugin<MyPlugin>() }\n\n    val mySampleExtensionImplementation by extending {\n        myPlugin.sampleExtensionPoint with SampleExtensionImpl() order {\n            before(myPlugin.firstExtension)\n            after(myPlugin.thirdExtension)\n        }\n    }\n}\n
"},{"location":"developer_guide/architecture/extension_points/extension_points/#conditional-apply","title":"Conditional apply","text":"

If you want your extension to be registered only if some condition is true, you can use the applyIf construct:

class MyExtendedPlugin : DokkaPlugin() {\n    private val myPlugin by lazy { plugin<MyPlugin>() }\n\n    val mySampleExtensionImplementation by extending {\n        myPlugin.sampleExtensionPoint with SampleExtensionImpl() applyIf {\n            Random.Default.nextBoolean()\n        }\n    }\n}\n
"},{"location":"developer_guide/architecture/extension_points/extension_points/#obtaining-extension-instance","title":"Obtaining extension instance","text":"

After an extension point has been created and some extensions have been registered, you can use query and querySingle functions to find all or just a single implementation.

class MyExtension(context: DokkaContext) {\n    // returns all registered extensions for the extension point\n    val allSampleExtensions = context.plugin<MyPlugin>().query { sampleExtensionPoint }\n\n    // will throw an exception if more than one extension is found.\n    // use if you expect only a single extension to be registered for the extension point\n    val singleSampleExtensions = context.plugin<MyPlugin>().querySingle { sampleExtensionPoint }\n\n    fun invoke() {\n        allSampleExtensions.forEach { it.doSomething(Input()) }\n\n        singleSampleExtensions.doSomething(Input())\n    }\n}\n

In order to have access to DokkaContext, you can use the providing keyword when registering an extension.

"},{"location":"developer_guide/architecture/extension_points/generation_implementations/","title":"Generation implementations","text":"

There are two main implementations of the Generation core extension point:

"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#singlemodulegeneration","title":"SingleModuleGeneration","text":"

SingleModuleGeneration is at the heart of generating documentation. It utilizes core and base extensions to build the documentation from start to finish.

Below you can see the flow of how Dokka's data model is transformed by various core and base extensions.

flowchart TD\n    Input -- SourceToDocumentableTranslator --> doc1[Documentables]\n    subgraph documentables [ ]\n    doc1 -- PreMergeDocumentableTransformer --> doc2[Documentables]\n    doc2 -- DocumentableMerger --> doc3[Documentables]\n    doc3 -- DocumentableTransformer --> doc4[Documentables]\n    end\n    doc4 -- DocumentableToPageTranslator --> page1[Pages]\n    subgraph ide2 [ ]\n    page1 -- PageTransformer --> page2[Pages]\n    end\n    page2 -- Renderer --> Output

You can read about what each stage does in Core extension points and Base plugin.

"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#allmodulespagegeneration","title":"AllModulesPageGeneration","text":"

AllModulesPageGeneration utilizes the output generated by SingleModuleGeneration.

Under the hood, it just collects all of the pages generated for the individual modules and assembles them together, creating navigation links between the modules and so on.

"},{"location":"developer_guide/community/slack/","title":"Slack channel","text":"

Dokka has a dedicated #dokka channel in the Kotlin Community Slack, where you can ask questions and chat about using, customizing or contributing to Dokka.

Follow the instructions to get an invite or connect directly.

"},{"location":"developer_guide/plugin-development/introduction/","title":"Introduction to plugin development","text":"

Dokka was built from the ground up to be easily extensible and highly customizable, which allows the community to implement plugins for missing or very specific features that are not provided out of the box.

Dokka plugins range anywhere from supporting other programming language sources to exotic output formats. You can add support for your own KDoc tags or annotations, teach Dokka how to render different DSLs that are found in KDoc descriptions, visually redesign Dokka's pages to be seamlessly integrated into your company's website, integrate it with other tools and so much more.

In order to have an easier time developing plugins, it's a good idea to go through Dokka's internals first, to learn more about its data model and extensions.

"},{"location":"developer_guide/plugin-development/introduction/#setup","title":"Setup","text":""},{"location":"developer_guide/plugin-development/introduction/#template","title":"Template","text":"

The easiest way to start is to use the convenient Dokka plugin template. It has pre-configured dependencies, publishing and signing of your artifacts.

"},{"location":"developer_guide/plugin-development/introduction/#manual","title":"Manual","text":"

At a bare minimum, a Dokka plugin requires dokka-core as a dependency:

import org.jetbrains.kotlin.gradle.dsl.JvmTarget\nimport org.jetbrains.kotlin.gradle.tasks.KotlinCompile\n\n\nplugins {\n    kotlin(\"jvm\") version \"<kotlin_version>\"\n}\n\ndependencies {\n    compileOnly(\"org.jetbrains.dokka:dokka-core:<dokka_version>\")\n}\n\ntasks.withType<KotlinCompile>().configureEach {\n    compilerOptions.jvmTarget.set(JvmTarget.JVM_1_8)\n}\n

In order to load a plugin into Dokka, your class must extend the DokkaPlugin class. A fully qualified name of that class must be placed in a file named org.jetbrains.dokka.plugability.DokkaPlugin under resources/META-INF/services. All instances are automatically loaded during Dokka's configuration step using java.util.ServiceLoader.

"},{"location":"developer_guide/plugin-development/introduction/#extension-points","title":"Extension points","text":"

Dokka provides a set of entry points for which you can create your own implementations. If you are not sure which extension points to use, have a look at core extensions and base extensions.

You can learn how to declare extension points and extensions in Introduction to Extension points.

In case no suitable extension point exists for your use case, do share the use case with the maintainers \u2014 it might be added in a future version of Dokka.

"},{"location":"developer_guide/plugin-development/introduction/#example","title":"Example","text":"

You can follow the sample plugin tutorial, which covers the creation of a simple plugin that hides members annotated with your own @Internal annotation: that is, it excludes these members from the generated documentation.

For more practical examples, have a look at sources of community plugins.

"},{"location":"developer_guide/plugin-development/introduction/#help","title":"Help","text":"

If you have any further questions, feel free to get in touch with Dokka's maintainers via Slack or GitHub.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/","title":"Sample plugin tutorial","text":"

We'll go over creating a simple plugin that covers a very common use case: generate documentation for everything except for members annotated with a custom @Internal annotation - they should be hidden.

The plugin will be tested with the following code:

package org.jetbrains.dokka.internal.test\n\nannotation class Internal\n\nfun shouldBeVisible() {}\n\n@Internal\nfun shouldBeExcludedFromDocumentation() {}\n

Expected behavior: function shouldBeExcludedFromDocumentation should not be visible in generated documentation.

Full source code of this tutorial can be found in Dokka's examples under hide-internal-api.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#preparing-the-project","title":"Preparing the project","text":"

We'll begin by using Dokka plugin template. Press the Use this template button and open this project in IntelliJ IDEA.

First, let's rename the pre-made template package and MyAwesomeDokkaPlugin class to something of our own.

For instance, package can be renamed to org.example.dokka.plugin and the class to HideInternalApiPlugin:

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.plugability.DokkaPlugin\n\nclass HideInternalApiPlugin : DokkaPlugin() {\n\n}\n

After you do that, make sure to update the path to this class in resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin:

org.example.dokka.plugin.HideInternalApiPlugin\n

At this point you can also change project name in settings.gradle.kts (to hide-internal-api in our case) and groupId in build.gradle.kts.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#extending-dokka","title":"Extending Dokka","text":"

After preparing the project we can begin extending Dokka with our own extension.

Having read through Core extensions, it's clear that we need a PreMergeDocumentableTransformer extension in order to filter out undesired documentables.

Moreover, the article mentioned a convenient abstract transformer SuppressedByConditionDocumentableFilterTransformer which is perfect for our use case, so we can try to implement it.

Create a new class, place it next to your plugin and implement the abstract method. You should end up with this:

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer\nimport org.jetbrains.dokka.model.Documentable\nimport org.jetbrains.dokka.plugability.DokkaContext\nimport org.jetbrains.dokka.plugability.DokkaPlugin\n\nclass HideInternalApiPlugin : DokkaPlugin() {}\n\nclass HideInternalApiTransformer(context: DokkaContext) : SuppressedByConditionDocumentableFilterTransformer(context) {\n\n    override fun shouldBeSuppressed(d: Documentable): Boolean {\n        return false\n    }\n}\n

Now we somehow need to find all annotations applied to d: Documentable and see if our @Internal annotation is present. However, it's not very clear how to do that. What usually helps is stopping in the debugger and having a look at what fields and values a given Documentable has.

To do that, we'll need to register our extension first; then we can publish our plugin and set the breakpoint.

Having read through Introduction to extensions, we now know how to register our extensions:

class HideInternalApiPlugin : DokkaPlugin() {\n    val myFilterExtension by extending {\n        plugin<DokkaBase>().preMergeDocumentableTransformer providing ::HideInternalApiTransformer\n    }\n}\n

At this point we're ready to debug our plugin locally; it should already work, but do nothing.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#debugging","title":"Debugging","text":"

Please read through Debugging Dokka, it goes over the same steps in more detail and with examples. Below you will find rough instructions.

First, let's begin by publishing our plugin to mavenLocal().

./gradlew publishToMavenLocal\n

This will publish your plugin under the groupId, artifactId and version that you've specified in your build.gradle.kts. In our case it's org.example:hide-internal-api:1.0-SNAPSHOT.

Open a debug project of your choosing that has Dokka configured, and add our plugin to dependencies:

dependencies {\n    dokkaPlugin(\"org.example:hide-internal-api:1.0-SNAPSHOT\")\n}\n

Next, in that project let's run dokkaHtml with debug enabled:

./gradlew clean dokkaHtml -Dorg.gradle.debug=true --no-daemon\n

Switch to the plugin project, set a breakpoint inside shouldBeSuppressed and run JVM remote debug.

If you've done everything correctly, it should stop in the debugger, and you should be able to observe the values contained inside d: Documentable.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#implementing-plugin-logic","title":"Implementing plugin logic","text":"

Now that we've stopped at our breakpoint, let's resume execution until the shouldBeExcludedFromDocumentation function appears in the place of d: Documentable (watch the name property change).

Looking at what's inside the object, you might notice it has three values in extra, one of which is Annotations. That sounds like exactly what we need!

Having poked around, we come up with the following monstrosity of code for determining whether a given documentable has the @Internal annotation (it can of course be refactored... later):

override fun shouldBeSuppressed(d: Documentable): Boolean {\n\n    val annotations: List<Annotations.Annotation> =\n        (d as? WithExtraProperties<*>)\n            ?.extra\n            ?.allOfType<Annotations>()\n            ?.flatMap { it.directAnnotations.values.flatten() }\n            ?: emptyList()\n\n    return annotations.any { isInternalAnnotation(it) }\n}\n\nprivate fun isInternalAnnotation(annotation: Annotations.Annotation): Boolean {\n   return annotation.dri.packageName == \"org.jetbrains.dokka.internal.test\"\n           && annotation.dri.classNames == \"Internal\"\n}\n
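
Since the code above is promised a refactoring later, here is one possible cleanup that relies only on the same model API; the behaviour is identical, only the shape changes:

override fun shouldBeSuppressed(d: Documentable): Boolean =\n    d.directAnnotations().any { it.isInternal() }\n\n// Annotations applied directly to the documentable, or an empty list if it carries no extra properties.\nprivate fun Documentable.directAnnotations(): List<Annotations.Annotation> =\n    (this as? WithExtraProperties<*>)\n        ?.extra\n        ?.allOfType<Annotations>()\n        ?.flatMap { it.directAnnotations.values.flatten() }\n        .orEmpty()\n\nprivate fun Annotations.Annotation.isInternal(): Boolean =\n    dri.packageName == \"org.jetbrains.dokka.internal.test\" && dri.classNames == \"Internal\"\n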

It seems we're done writing our plugin and can begin testing it manually.

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#manual-testing","title":"Manual testing","text":"

At this point, the implementation of your plugin should look roughly like this:

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.base.DokkaBase\nimport org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer\nimport org.jetbrains.dokka.model.Annotations\nimport org.jetbrains.dokka.model.Documentable\nimport org.jetbrains.dokka.model.properties.WithExtraProperties\nimport org.jetbrains.dokka.plugability.DokkaContext\nimport org.jetbrains.dokka.plugability.DokkaPlugin\n\nclass HideInternalApiPlugin : DokkaPlugin() {\n    val myFilterExtension by extending {\n        plugin<DokkaBase>().preMergeDocumentableTransformer providing ::HideInternalApiTransformer\n    }\n}\n\nclass HideInternalApiTransformer(context: DokkaContext) : SuppressedByConditionDocumentableFilterTransformer(context) {\n\n    override fun shouldBeSuppressed(d: Documentable): Boolean {\n        val annotations: List<Annotations.Annotation> =\n            (d as? WithExtraProperties<*>)\n                ?.extra\n                ?.allOfType<Annotations>()\n                ?.flatMap { it.directAnnotations.values.flatten() }\n                ?: emptyList()\n\n        return annotations.any { isInternalAnnotation(it) }\n    }\n\n    private fun isInternalAnnotation(annotation: Annotations.Annotation): Boolean {\n        return annotation.dri.packageName == \"org.jetbrains.dokka.internal.test\"\n                && annotation.dri.classNames == \"Internal\"\n    }\n}\n

Bump the plugin version in build.gradle.kts, publish it to Maven Local, open the debug project and run dokkaHtml (without debug this time). It should work: you should no longer see the shouldBeExcludedFromDocumentation function in the generated documentation.
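
If you are assembling the debug project yourself, the source it documents can be as small as the sketch below; the file path is only an example, but the package must match the one hardcoded in the plugin:

// src/main/kotlin/basic/Test.kt\npackage org.jetbrains.dokka.internal.test\n\nannotation class Internal\n\nfun shouldBeVisible() {}\n\n@Internal\nfun shouldBeExcludedFromDocumentation() {}\n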

Manual testing is cool and all, but wouldn't it be better if we could somehow write unit tests for it? Indeed!

"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#unit-testing","title":"Unit testing","text":"

You might've noticed that the plugin template comes with a pre-made test class. Feel free to move it to another package and rename it.

We are mostly interested in a single test case: functions annotated with @Internal should be hidden, while all other public functions should remain visible.

The plugin API comes with a set of convenient test utilities that are used to test Dokka itself, so they cover a wide range of use cases. When in doubt, see Dokka's own tests for reference.

Below you will find a complete unit test that passes, followed by the main takeaways.

package org.example.dokka.plugin\n\nimport org.jetbrains.dokka.base.testApi.testRunner.BaseAbstractTest\nimport kotlin.test.Test\nimport kotlin.test.assertEquals\n\nclass HideInternalApiPluginTest : BaseAbstractTest() {\n\n    @Test\n    fun `should hide annotated functions`() {\n        val configuration = dokkaConfiguration {\n            sourceSets {\n                sourceSet {\n                    sourceRoots = listOf(\"src/main/kotlin/basic/Test.kt\")\n                }\n            }\n        }\n        val hideInternalPlugin = HideInternalApiPlugin()\n\n        testInline(\n            \"\"\"\n            |/src/main/kotlin/basic/Test.kt\n            |package org.jetbrains.dokka.internal.test\n            |\n            |annotation class Internal\n            |\n            |fun shouldBeVisible() {}\n            |\n            |@Internal\n            |fun shouldBeExcludedFromDocumentation() {}\n        \"\"\".trimMargin(),\n            configuration = configuration,\n            pluginOverrides = listOf(hideInternalPlugin)\n        ) {\n            preMergeDocumentablesTransformationStage = { modules ->\n                val testModule = modules.single { it.name == \"root\" }\n                val testPackage = testModule.packages.single { it.name == \"org.jetbrains.dokka.internal.test\" }\n\n                val packageFunctions = testPackage.functions\n                assertEquals(1, packageFunctions.size)\n                assertEquals(\"shouldBeVisible\", packageFunctions[0].name)\n            }\n        }\n    }\n}\n

Note that the package of the tested code (inside the testInline function) is the same as the package that we have hardcoded in our plugin. Make sure to change it to your own if you are following along, otherwise the test will fail.

Things to note and remember:

  1. Your test class should extend BaseAbstractTest, which contains base utility methods for testing.
  2. You can configure Dokka to your liking: enable specific settings, configure source sets, and so on. This is all done via the dokkaConfiguration DSL.
  3. The testInline function is the main entry point for unit tests.
  4. You can pass plugins to be used in a test; note the pluginOverrides parameter.
  5. You can write asserts for different stages of generating documentation, the main ones being Documentables model generation, Pages generation and Output generation. Since we implemented our plugin to work during the PreMergeDocumentableTransformer stage, we can test it on that same level (that is, preMergeDocumentablesTransformationStage); a sketch of asserting on a later stage follows this list.
  6. You will need to write asserts using the model of whatever stage you choose. For the Documentable transformation stage it is Documentable, for the Page generation stage you would have the Page model, and for Output you get .html files that you will need to parse, for example with JSoup (there are utilities for that as well).
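
As promised in point 5, here is a rough sketch of an assert on a later stage, added inside the same testInline block. The hook name pagesTransformationStage, the PageNode type and the extra kotlin.test.assertFalse import are assumptions based on the same test DSL, so check BaseAbstractTest and its test methods for the exact API available in your Dokka version:

pagesTransformationStage = { rootPage ->\n    // Walk the page tree and collect every page name.\n    fun names(node: PageNode): List<String> =\n        listOf(node.name) + node.children.flatMap { names(it) }\n\n    assertFalse(\"shouldBeExcludedFromDocumentation\" in names(rootPage))\n}\n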

The full source code for this tutorial can be found in Dokka's examples, under hide-internal-api.

"}]} \ No newline at end of file diff --git a/2.0.20-SNAPSHOT/sitemap.xml b/2.0.20-SNAPSHOT/sitemap.xml index 30f2202b91..f12ed3dedd 100644 --- a/2.0.20-SNAPSHOT/sitemap.xml +++ b/2.0.20-SNAPSHOT/sitemap.xml @@ -2,72 +2,72 @@ https://github.com/Kotlin/dokka/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/introduction/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/workflow/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/architecture_overview/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/data_model/documentable_model/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/data_model/extra/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/data_model/page_content/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/extension_points/base_plugin/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/extension_points/core_extension_points/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/extension_points/extension_points/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/architecture/extension_points/generation_implementations/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/community/slack/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/plugin-development/introduction/ - 2024-07-05 + 2024-08-02 daily https://github.com/Kotlin/dokka/developer_guide/plugin-development/sample-plugin-tutorial/ - 2024-07-05 + 2024-08-02 daily \ No newline at end of file diff --git a/2.0.20-SNAPSHOT/sitemap.xml.gz b/2.0.20-SNAPSHOT/sitemap.xml.gz index 5acbe7700d148e238b0c31d64e8d95c890b93067..d489f4a7ea49a63d81cdea973cb2d96f4d0fd26f 100644 GIT binary patch literal 390 zcmV;10eSu(iwFqS|Ey*L|8r?{Wo=<_E_iKh0NqxxZi6ro-SZWQ9ZaJRX+_A^rE5DQ z$G#W~RDB!6F%pej{6b*SWcfqOik-m{Oh>)mNE*%6%AXt()8S>+j6fl^zw&EMZ% zicP+2R&{X317(qH{WkB2#%IQ+X)3)Y82IQ*bc%V=#lW)#WL5c;H>)g;Il=YqGHQdO znX~wqMrrE8z7#mHS0tk?li2s1)ofaT zn{&->kgFQ@CFd`~Bpkuu2(DE@OJLfPLm33cHB3kKgswjroYGOuo#V#M>goQmZO1wyC!Kb)zTN4zpOpW$gRo?;6C0EphjCjbBd literal 392 zcmV;30eAi%iwFqJxQAu}|8r?{Wo=<_E_iKh0NqwSZ-X!p-TNyLJD5hT`XNHLE?wIh zIrarCjE~yqko^1F393@HQ-?~97r4js={@^6yWSi|o$bLog=U@KmQ|jC5h#`E*7^I} zOYxX*nw#2N?SQgCHg28wMAI{4^E{VM5sZ9tCEAWT(TBjX1!PtEls7k77%{>5?KEnQ zqKR2tNYg3m{k9Z1vR5Rn43jtvoYgF{E+=tEBg$q~t?r8Ip{VWyWx+0JqUWZAorRA^ za;1+oyFkur*q5BY2$Qe}jT6|m^GX6!hAc`iD6C;Rs%NyrPUD=8Vs1qrDoc>J--8xI z1@9H4>P2$mZBPU{u!SGN5Z`K8pyWNlfh6{l1sV$5lc<}p6JTPiZzl|0%dstY z&`#QS%3O;jfuSM%K(EBywfgG7faUQ3ZPhqwIJLkSCgd;>MsE}ynbTYh=?Fjn|HT5w mrBC!q^NSJ3V>uPe7xIKw@qai|PmgeAZax86f>(TE3jhE$SiR@~