Ensure most of links in docs are correct #1808

Merged
4 changes: 2 additions & 2 deletions datafusion-expr/src/expr.rs
@@ -106,7 +106,7 @@ pub enum Expr {
IsNull(Box<Expr>),
/// arithmetic negation of an expression, the operand must be of a signed numeric data type
Negative(Box<Expr>),
/// Returns the field of a [`ListArray`] or [`StructArray`] by key
/// Returns the field of a [`arrow::array::ListArray`] or [`arrow::array::StructArray`] by key
GetIndexedField {
/// the expression to take the field from
expr: Box<Expr>,
@@ -248,7 +248,7 @@ impl PartialOrd for Expr {
}

impl Expr {
/// Returns the name of this expression based on [crate::logical_plan::DFSchema].
/// Returns the name of this expression based on [datafusion_common::DFSchema].
///
/// This represents how a column with this expression is named when no alias is chosen
pub fn name(&self, input_schema: &DFSchema) -> Result<String> {
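For reviewers who want the link conventions this PR standardizes on in one place, below is a minimal, self-contained sketch (a placeholder `add` function, not a DataFusion item) showing the three rustdoc forms used throughout the diff: fully qualified intra-doc links, angle-bracket autolinks for bare URLs, and `text` fences for pseudo-code that should not run as a doctest.

/// Adds two numbers.
///
/// An intra-doc link resolves against paths visible from this item, so a fully
/// qualified path such as [`std::num::NonZeroU32`] always resolves, while a bare
/// `[NonZeroU32]` only resolves if that name happens to be in scope here.
///
/// Bare URLs are wrapped in angle brackets so rustdoc renders them as links and
/// the `rustdoc::bare_urls` lint stays quiet: <https://doc.rust-lang.org/rustdoc/>
///
/// Pseudo-code that is not meant to compile goes in a `text` fence rather than an
/// `ignore` fence, so it is rendered but never picked up as a doctest:
///
/// ```text
/// add(1, 2) -> 3
/// ```
pub fn add(a: u32, b: u32) -> u32 {
    a + b
}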
6 changes: 3 additions & 3 deletions datafusion-expr/src/signature.rs
@@ -24,15 +24,15 @@ use arrow::datatypes::DataType;
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Clone, Copy, Hash)]
pub enum Volatility {
/// Immutable - An immutable function will always return the same output when given the same
/// input. An example of this is [BuiltinScalarFunction::Cos].
/// input. An example of this is [super::BuiltinScalarFunction::Cos].
Immutable,
/// Stable - A stable function may return different values given the same input across different
/// queries but must return the same value for a given input within a query. An example of
/// this is [BuiltinScalarFunction::Now].
/// this is [super::BuiltinScalarFunction::Now].
Stable,
/// Volatile - A volatile function may change the return value from evaluation to evaluation.
/// Multiple invocations of a volatile function may return different results when used in the
/// same query. An example of this is [BuiltinScalarFunction::Random].
/// same query. An example of this is [super::BuiltinScalarFunction::Random].
Volatile,
}

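To make the three volatility levels above concrete, here is a small standalone sketch of how a constant-folding pass could consult them when deciding what is safe to pre-evaluate at plan time. The enum is redefined locally so the snippet compiles on its own; it is an illustration, not DataFusion's optimizer code.

// Local copy of the three volatility levels described above.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Volatility {
    Immutable,
    Stable,
    Volatile,
}

/// Whether a call with this volatility may be replaced by its computed value
/// during planning. `within_single_query` is true when the folded plan will
/// only ever serve the query being planned right now (not a cached plan).
fn can_constant_fold(v: Volatility, within_single_query: bool) -> bool {
    match v {
        // Same input, same output (e.g. `cos`): always safe to fold.
        Volatility::Immutable => true,
        // Stable within one query (e.g. `now()`): safe only for this query.
        Volatility::Stable => within_single_query,
        // May change on every evaluation (e.g. `random()`): never fold.
        Volatility::Volatile => false,
    }
}

fn main() {
    assert!(can_constant_fold(Volatility::Immutable, false));
    assert!(can_constant_fold(Volatility::Stable, true));
    assert!(!can_constant_fold(Volatility::Volatile, true));
}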
6 changes: 3 additions & 3 deletions datafusion/src/catalog/information_schema.rs
@@ -17,7 +17,7 @@

//! Implements the SQL [Information Schema] for DataFusion.
//!
//! Information Schema](https://en.wikipedia.org/wiki/Information_schema)
//! Information Schema]<https://en.wikipedia.org/wiki/Information_schema>

use std::{
any,
@@ -195,7 +195,7 @@ impl SchemaProvider for InformationSchemaProvider {

/// Builds the `information_schema.TABLE` table row by row
///
/// Columns are based on https://www.postgresql.org/docs/current/infoschema-columns.html
/// Columns are based on <https://www.postgresql.org/docs/current/infoschema-columns.html>
struct InformationSchemaTablesBuilder {
catalog_names: StringBuilder,
schema_names: StringBuilder,
@@ -276,7 +276,7 @@ impl From<InformationSchemaTablesBuilder> for MemTable {

/// Builds the `information_schema.COLUMNS` table row by row
///
/// Columns are based on https://www.postgresql.org/docs/current/infoschema-columns.html
/// Columns are based on <https://www.postgresql.org/docs/current/infoschema-columns.html>
struct InformationSchemaColumnsBuilder {
catalog_names: StringBuilder,
schema_names: StringBuilder,
2 changes: 1 addition & 1 deletion datafusion/src/datasource/object_store/mod.rs
@@ -90,7 +90,7 @@ pub struct FileMeta {
/// The last modification time of the file according to the
/// object store metadata. This information might be used by
/// catalog systems like Delta Lake for time travel (see
/// https://github.com/delta-io/delta/issues/192)
/// <https://github.com/delta-io/delta/issues/192>)
pub last_modified: Option<DateTime<Utc>>,
}

2 changes: 1 addition & 1 deletion datafusion/src/lib.rs
@@ -159,7 +159,7 @@
//! * Projection: [`ProjectionExec`](physical_plan::projection::ProjectionExec)
//! * Filter: [`FilterExec`](physical_plan::filter::FilterExec)
//! * Hash and Grouped aggregations: [`HashAggregateExec`](physical_plan::hash_aggregate::HashAggregateExec)
//! * Sort: [`SortExec`](physical_plan::sort::SortExec)
//! * Sort: [`SortExec`](physical_plan::sorts::sort::SortExec)
//! * Coalesce partitions: [`CoalescePartitionsExec`](physical_plan::coalesce_partitions::CoalescePartitionsExec)
//! * Limit: [`LocalLimitExec`](physical_plan::limit::LocalLimitExec) and [`GlobalLimitExec`](physical_plan::limit::GlobalLimitExec)
//! * Scan a CSV: [`CsvExec`](physical_plan::file_format::CsvExec)
4 changes: 2 additions & 2 deletions datafusion/src/logical_plan/expr.rs
@@ -144,14 +144,14 @@ pub fn combine_filters(filters: &[Expr]) -> Option<Expr> {
///
/// For example, it rewrites:
///
/// ```ignore
/// ```text
/// .aggregate(vec![col("c1")], vec![sum(col("c2"))])?
/// .project(vec![col("c1"), sum(col("c2"))?
/// ```
///
/// Into:
///
/// ```ignore
/// ```text
/// .aggregate(vec![col("c1")], vec![sum(col("c2"))])?
/// .project(vec![col("c1"), col("SUM(#c2)")?
/// ```
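The before/after snippets in this doc comment are easier to follow with a runnable illustration. The standalone sketch below uses a toy expression type (not DataFusion's `Expr`) to show the rewrite they describe: once the aggregate node computes `SUM(#c2)`, a later projection refers to that output column by name instead of restating the aggregate expression.

#[derive(Debug, Clone, PartialEq)]
enum Expr {
    Column(String),
    Sum(Box<Expr>),
}

/// Name of the output column an expression produces, e.g. `SUM(#c2)`.
fn output_name(e: &Expr) -> String {
    match e {
        Expr::Column(c) => c.clone(),
        Expr::Sum(inner) => format!("SUM(#{})", output_name(inner)),
    }
}

/// If the enclosing aggregate already computes `e`, reference its output
/// column instead of repeating the aggregate expression.
fn rewrite_against_aggregate(e: &Expr, agg_exprs: &[Expr]) -> Expr {
    if agg_exprs.contains(e) {
        Expr::Column(output_name(e))
    } else {
        e.clone()
    }
}

fn main() {
    let sum_c2 = Expr::Sum(Box::new(Expr::Column("c2".into())));
    let agg_exprs = vec![sum_c2.clone()];
    assert_eq!(
        rewrite_against_aggregate(&sum_c2, &agg_exprs),
        Expr::Column("SUM(#c2)".into())
    );
}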
6 changes: 3 additions & 3 deletions datafusion/src/logical_plan/expr_rewriter.rs
@@ -32,7 +32,7 @@ use std::sync::Arc;
pub enum RewriteRecursion {
/// Continue rewrite / visit this expression.
Continue,
/// Call [mutate()] immediately and return.
/// Call [ExprRewriter::mutate()] immediately and return.
Mutate,
/// Do not rewrite / visit the children of this expression.
Stop,
@@ -339,13 +339,13 @@ fn rewrite_sort_col_by_aggs(expr: Expr, plan: &LogicalPlan) -> Result<Expr> {
}
}

/// Recursively call [`Column::normalize`] on all Column expressions
/// Recursively call [`Column::normalize_with_schemas`] on all Column expressions
/// in the `expr` expression tree.
pub fn normalize_col(expr: Expr, plan: &LogicalPlan) -> Result<Expr> {
normalize_col_with_schemas(expr, &plan.all_schemas(), &plan.using_columns()?)
}

/// Recursively call [`Column::normalize`] on all Column expressions
/// Recursively call [`Column::normalize_with_schemas`] on all Column expressions
/// in the `expr` expression tree.
fn normalize_col_with_schemas(
expr: Expr,
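For readers new to this module, a standalone sketch of what "normalizing" a column means here: walk the expression tree and qualify every bare column reference against the plan's schemas. The types below are toys chosen for the example, not `Column::normalize_with_schemas` itself.

use std::collections::HashMap;

#[derive(Debug, Clone, PartialEq)]
enum Expr {
    Column { relation: Option<String>, name: String },
    Add(Box<Expr>, Box<Expr>),
}

/// `schema` maps a column name to the relation that provides it.
fn normalize(expr: Expr, schema: &HashMap<String, String>) -> Expr {
    match expr {
        // Qualify a bare name if the schema knows which relation owns it.
        Expr::Column { relation: None, name } => {
            let relation = schema.get(&name).cloned();
            Expr::Column { relation, name }
        }
        Expr::Column { relation, name } => Expr::Column { relation, name },
        // Recurse into children, mirroring how `normalize_col` walks the tree.
        Expr::Add(l, r) => Expr::Add(
            Box::new(normalize(*l, schema)),
            Box::new(normalize(*r, schema)),
        ),
    }
}

fn main() {
    let schema = HashMap::from([("c1".to_string(), "t1".to_string())]);
    let expr = Expr::Add(
        Box::new(Expr::Column { relation: None, name: "c1".into() }),
        Box::new(Expr::Column { relation: Some("t2".into()), name: "c2".into() }),
    );
    let normalized = normalize(expr, &schema);
    assert_eq!(
        normalized,
        Expr::Add(
            Box::new(Expr::Column { relation: Some("t1".into()), name: "c1".into() }),
            Box::new(Expr::Column { relation: Some("t2".into()), name: "c2".into() }),
        )
    );
}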
2 changes: 1 addition & 1 deletion datafusion/src/optimizer/simplify_expressions.rs
@@ -272,7 +272,7 @@ pub struct ConstEvaluator<'a> {
/// non evaluatable (e.g. had a column reference or volatile
/// function)
///
/// Specifically, can_evaluate[N] represents the state of
/// Specifically, `can_evaluate[N]` represents the state of
/// traversal when we are N levels deep in the tree, one entry for
/// this Expr and each of its parents.
///
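The `can_evaluate[N]` description is easier to follow with a worked example. Below is a standalone sketch (toy node type, not DataFusion's `ConstEvaluator`) of the bookkeeping it describes: one flag per level of the current traversal path, where finding a column reference (or, in the real code, a volatile function) poisons the flag of the current node and every ancestor.

enum Node {
    Literal(i64),
    ColumnRef(String),
    Add(Box<Node>, Box<Node>),
}

/// Walks `node` keeping one "still constant-evaluable?" flag per level of the
/// current path in `can_evaluate`; returns whether the whole subtree rooted at
/// `node` could be pre-computed to a constant.
fn walk(node: &Node, can_evaluate: &mut Vec<bool>) -> bool {
    // One entry for this node; assume evaluable until proven otherwise.
    can_evaluate.push(true);
    match node {
        Node::Literal(_) => {}
        Node::ColumnRef(_) => {
            // A column reference poisons this node and every ancestor on the path.
            for flag in can_evaluate.iter_mut() {
                *flag = false;
            }
        }
        Node::Add(l, r) => {
            walk(l, can_evaluate);
            walk(r, can_evaluate);
        }
    }
    // Pop this node's entry on the way back up.
    can_evaluate.pop().unwrap()
}

fn main() {
    let mixed = Node::Add(
        Box::new(Node::Literal(1)),
        Box::new(Node::ColumnRef("a".into())),
    );
    let pure = Node::Add(Box::new(Node::Literal(1)), Box::new(Node::Literal(2)));
    assert!(!walk(&mixed, &mut Vec::new())); // references a column: not foldable
    assert!(walk(&pure, &mut Vec::new())); // all literals: foldable
}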
6 changes: 3 additions & 3 deletions datafusion/src/physical_optimizer/pruning.rs
@@ -25,7 +25,7 @@
//! nothing in the row group can match.
//!
//! This code is currently specific to Parquet, but soon (TM), via
//! https://github.com/apache/arrow-datafusion/issues/363 it will
//! <https://github.com/apache/arrow-datafusion/issues/363> it will
//! be genericized.

use std::convert::TryFrom;
@@ -47,7 +47,7 @@ use crate::{
physical_plan::{ColumnarValue, PhysicalExpr},
};

/// Interface to pass statistics information to [`PruningPredicates`]
/// Interface to pass statistics information to [`PruningPredicate`]
///
/// Returns statistics for containers / files of data in Arrays.
///
@@ -88,7 +88,7 @@ pub trait PruningStatistics {
/// Evaluates filter expressions on statistics in order to
/// prune data containers (e.g. parquet row group)
///
/// See [`try_new`] for more information.
/// See [`PruningPredicate::try_new`] for more information.
#[derive(Debug, Clone)]
pub struct PruningPredicate {
/// The input schema against which the predicate will be evaluated
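As a standalone illustration of the idea behind `PruningPredicate` (toy types, not the actual `PruningStatistics` API): a row-level predicate such as `x > 5` is turned into a container-level test over statistics, and a container is pruned only when its statistics prove that no row in it can match.

/// Per-container (e.g. per Parquet row group) statistics for column `x`.
struct ContainerStats {
    x_max: Option<i64>,
}

/// True if the container *might* hold rows satisfying `x > value`.
/// Missing statistics must err on the side of keeping the container.
fn might_match_gt(stats: &ContainerStats, value: i64) -> bool {
    match stats.x_max {
        Some(max) => max > value,
        None => true,
    }
}

fn main() {
    let row_groups = [
        ContainerStats { x_max: Some(3) },  // max <= 5: provably no match, prune
        ContainerStats { x_max: Some(10) }, // may contain matches: scan it
        ContainerStats { x_max: None },     // unknown statistics: scan it
    ];
    let keep: Vec<bool> = row_groups.iter().map(|g| might_match_gt(g, 5)).collect();
    assert_eq!(keep, vec![false, true, true]);
}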
2 changes: 1 addition & 1 deletion datafusion/src/physical_plan/hyperloglog/mod.rs
@@ -20,7 +20,7 @@
//! `hyperloglog` is a module that contains a modified version
//! of [redis's implementation](https://github.com/redis/redis/blob/4930d19e70c391750479951022e207e19111eb55/src/hyperloglog.c)
//! with some modification based on strong assumption of usage
//! within datafusion, so that [`approx_distinct`] function can
//! within datafusion, so that [`crate::logical_plan::approx_distinct`] function can
//! be efficiently implemented.
//!
//! Specifically, like Redis's version, this HLL structure uses
6 changes: 3 additions & 3 deletions datafusion/src/physical_plan/metrics/mod.rs
@@ -40,7 +40,7 @@ pub use tracker::MemTrackingMetrics;
pub use value::{Count, Gauge, MetricValue, ScopedTimerGuard, Time, Timestamp};

/// Something that tracks a value of interest (metric) of a DataFusion
/// [`ExecutionPlan`] execution.
/// [`super::ExecutionPlan`] execution.
///
/// Typically [`Metric`]s are not created directly, but instead
/// are created using [`MetricBuilder`] or methods on
@@ -319,7 +319,7 @@ impl Display for MetricsSet {
/// A set of [`Metric`] for an individual "operator" (e.g. `&dyn
/// ExecutionPlan`).
///
/// This structure is intended as a convenience for [`ExecutionPlan`]
/// This structure is intended as a convenience for [`super::ExecutionPlan`]
/// implementations so they can generate different streams for multiple
/// partitions but easily report them together.
///
@@ -358,7 +358,7 @@ impl ExecutionPlanMetricsSet {
/// "tags" in
/// [InfluxDB](https://docs.influxdata.com/influxdb/v1.8/write_protocols/line_protocol_tutorial/)
/// , "attributes" in [open
/// telemetry](https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/metrics/datamodel.md],
/// telemetry]<https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/metrics/datamodel.md>,
/// etc.
///
/// As the name and value are expected to mostly be constant strings,
2 changes: 1 addition & 1 deletion datafusion/src/physical_plan/metrics/tracker.rs
@@ -102,7 +102,7 @@ impl MemTrackingMetrics {

/// Record that some number of rows have been produced as output
///
/// See the [`RecordOutput`] for conveniently recording record
/// See the [`super::RecordOutput`] for conveniently recording record
/// batch output for other thing
pub fn record_output(&self, num_rows: usize) {
self.metrics.record_output(num_rows)
2 changes: 1 addition & 1 deletion datafusion/src/physical_plan/metrics/value.rs
@@ -312,7 +312,7 @@ impl<'a> Drop for ScopedTimerGuard<'a> {
}
}

/// Possible values for a [`Metric`].
/// Possible values for a [super::Metric].
///
/// Among other differences, the metric types have different ways to
/// logically interpret their underlying values and some metrics are
2 changes: 1 addition & 1 deletion datafusion/src/physical_plan/mod.rs
@@ -538,7 +538,7 @@ pub trait WindowExpr: Send + Sync + Debug {
}

/// expressions that are passed to the WindowAccumulator.
/// Functions which take a single input argument, such as `sum`, return a single [`Expr`],
/// Functions which take a single input argument, such as `sum`, return a single [`datafusion_expr::expr::Expr`],
/// others (e.g. `cov`) return many.
fn expressions(&self) -> Vec<Arc<dyn PhysicalExpr>>;

2 changes: 1 addition & 1 deletion datafusion/src/physical_plan/tdigest/mod.rs
@@ -17,7 +17,7 @@
//! quantile calculations.
//!
//! The TDigest code in this module is modified from
//! https://github.com/MnO2/t-digest, itself a rust reimplementation of
//! <https://github.com/MnO2/t-digest>, itself a rust reimplementation of
//! [Facebook's Folly TDigest] implementation.
//!
//! Alterations include reduction of runtime heap allocations, broader type
2 changes: 1 addition & 1 deletion datafusion/src/physical_plan/window_functions.rs
@@ -18,7 +18,7 @@
//! Window functions provide the ability to perform calculations across
//! sets of rows that are related to the current query row.
//!
//! see also https://www.postgresql.org/docs/current/functions-window.html
//! see also <https://www.postgresql.org/docs/current/functions-window.html>

use crate::error::{DataFusionError, Result};
use crate::physical_plan::functions::{TypeSignature, Volatility};
6 changes: 3 additions & 3 deletions datafusion/src/sql/planner.rs
@@ -698,7 +698,7 @@ impl<'a, S: ContextProvider> SqlToRel<'a, S> {
}

/// Generate a logic plan from selection clause, the function contain optimization for cross join to inner join
/// Related PR: https://github.com/apache/arrow-datafusion/pull/1566
/// Related PR: <https://github.com/apache/arrow-datafusion/pull/1566>
fn plan_selection(
&self,
select: &Select,
@@ -2084,11 +2084,11 @@ fn remove_join_expressions(
/// Filters matching this pattern are added to `accum`
/// Filters that don't match this pattern are added to `accum_filter`
/// Examples:
///
/// ```text
/// foo = bar => accum=[(foo, bar)] accum_filter=[]
/// foo = bar AND bar = baz => accum=[(foo, bar), (bar, baz)] accum_filter=[]
/// foo = bar AND baz > 1 => accum=[(foo, bar)] accum_filter=[baz > 1]
///
/// ```
fn extract_join_keys(
expr: &Expr,
accum: &mut Vec<(Column, Column)>,
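To make the `accum`/`accum_filter` examples above concrete, here is a standalone sketch (toy expression type, not DataFusion's `Expr`/`Column`) of the extraction they describe: recurse through `AND`s, collect column-equals-column predicates as join keys, and route everything else into the residual filter list.

#[derive(Debug, Clone, PartialEq)]
enum Expr {
    Column(String),
    Literal(i64),
    Eq(Box<Expr>, Box<Expr>),
    Gt(Box<Expr>, Box<Expr>),
    And(Box<Expr>, Box<Expr>),
}

fn extract_join_keys(
    expr: &Expr,
    accum: &mut Vec<(String, String)>,
    accum_filter: &mut Vec<Expr>,
) {
    match expr {
        // Conjunctions are split and each side is processed independently.
        Expr::And(l, r) => {
            extract_join_keys(l, accum, accum_filter);
            extract_join_keys(r, accum, accum_filter);
        }
        // `column = column` becomes a join key; any other equality is a filter.
        Expr::Eq(l, r) => match (l.as_ref(), r.as_ref()) {
            (Expr::Column(a), Expr::Column(b)) => accum.push((a.clone(), b.clone())),
            _ => accum_filter.push(expr.clone()),
        },
        // Everything else stays as a residual filter.
        other => accum_filter.push(other.clone()),
    }
}

fn main() {
    // foo = bar AND baz > 1  =>  accum = [(foo, bar)], accum_filter = [baz > 1]
    let expr = Expr::And(
        Box::new(Expr::Eq(
            Box::new(Expr::Column("foo".into())),
            Box::new(Expr::Column("bar".into())),
        )),
        Box::new(Expr::Gt(
            Box::new(Expr::Column("baz".into())),
            Box::new(Expr::Literal(1)),
        )),
    );
    let mut keys = Vec::new();
    let mut rest = Vec::new();
    extract_join_keys(&expr, &mut keys, &mut rest);
    assert_eq!(keys, vec![("foo".to_string(), "bar".to_string())]);
    assert_eq!(rest.len(), 1);
}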