Merge remote-tracking branch 'origin/dev' into dev-ext
Dolu1990 committed Feb 4, 2025
2 parents 74d7ab2 + 5b09686 commit 25382fa
Showing 36 changed files with 915 additions and 488 deletions.
34 changes: 34 additions & 0 deletions .github/workflows/push-docs.yml
@@ -0,0 +1,34 @@
name: "Push scaladoc on gh-pages"

on:
  push:
    branches:
      - dev
  release:
    types:
      - published

jobs:
  push-docs:
    runs-on: ubuntu-22.04
    timeout-minutes: 30

    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Setup JDK
        uses: actions/setup-java@v3
        with:
          distribution: temurin
          java-version: 17
          cache: sbt
      - name: Build scaladoc
        run: sbt clean doc
      - name: Deploy 🚀
        uses: JamesIves/github-pages-deploy-action@v4
        with:
          folder: target/scala-2.12/api
          target-folder: ${{ github.ref_name }}
          single-commit: true
4 changes: 4 additions & 0 deletions README.md
@@ -25,6 +25,10 @@ Here is the online documentation :
- https://spinalhdl.github.io/VexiiRiscv-RTD/master/VexiiRiscv/Introduction/#
- https://spinalhdl.github.io/VexiiRiscv-RTD/master/VexiiRiscv/HowToUse/index.html

Here is VexiiRiscv's Scaladoc (auto-generated from the source code) :

- https://spinalhdl.github.io/VexiiRiscv/doc/vexiiriscv/index.html

A roadmap is available here :

- https://github.com/SpinalHDL/VexiiRiscv/issues/1
23 changes: 19 additions & 4 deletions doc/litex/alpine/README.md
@@ -1,6 +1,17 @@
https://blog.ari.lt/b/how-to-manually-install-alpine-linux-on-any-linux-distribution/
https://wiki.gentoo.org/wiki/OpenRC_to_systemd_Cheatsheet
# Getting Alpine Linux to run on RISC-V

This README compiles the set of commands that were used to get Alpine Linux up and running on the VexiiRiscv RV64GC Litex SoC.
You can find most of the fundamental information at the links below :

- https://blog.ari.lt/b/how-to-manually-install-alpine-linux-on-any-linux-distribution/
- https://wiki.gentoo.org/wiki/OpenRC_to_systemd_Cheatsheet

Overall, if you have a system which can run Debian, then it should be able to run Alpine without issues.
The one tricky thing with Alpine Linux is that the image you download generally needs a few tweaks via chroot to be functional.

## Image setup

```bash
cd /media/rawrr/rootfs
sudo rm -rf *
sudo tar xpvf /media/data2/download/alpine-minirootfs-3.20.1-riscv64.tar.gz --xattrs-include='*.*' --numeric-owner
@@ -15,18 +26,22 @@ rc-update add loadkeys boot
rc-update add chronyd boot

apk add xf86-video-fbdev xterm
```

setup-keymap ch fr
## Scrap commands

```bash
setup-keymap ch fr
XKBMODEL="pc105"
XKBLAYOUT="ch"
XKBVARIANT="fr"
XKBOPTIONS=""

mount -o remount,rw /

kbd-bkeymaps

loadkeys ch-fr

apk add chocolate-doom --repository=https://dl-cdn.alpinelinux.org/alpine/edge/testing

```
5 changes: 5 additions & 0 deletions doc/litex/buildroot/README.md
@@ -2,6 +2,11 @@

This README documents, from scratch, how to generate all the images needed to run Buildroot on Litex + VexiiRiscv.

The hardware requirements are (see the configuration sketch after this list) :
- RV32IMA CPU
- Some RAM (64 MB)
- SDCARD support
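
As a rough, assumption-level illustration of what such a core means on the generator side, the sketch below builds a ParamSimple configuration along these lines, using only fields and mutators visible in this commit's Param.scala diff (the atomic extension and the SoC/SDCARD side are not covered by this commit, so they are left out). It is not the configuration LiteX actually uses; the RV64IMAFDC case of the Debian README would additionally need `xlen = 64` plus the compressed and floating-point related options.

```scala
import vexiiriscv.ParamSimple

// Assumption-level sketch: a ParamSimple roughly matching the RV32IMA + Linux requirement.
// Only fields/mutators visible in this commit's Param.scala diff are used.
object BuildrootLikeConfig extends App {
  val p = new ParamSimple()
  p.xlen = 32     // RV32 core
  p.withRvm()     // multiply/divide, the "M" in RV32IMA
  p.withCaches()  // instruction/data caches, sensible in front of 64 MB of DRAM
  p.withLinux()   // supervisor + user mode and MMU, needed to boot a Linux kernel
  val plugins = p.plugins() // plugin list for hart 0, ready to be elaborated
}
```
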

## Setup environment variables

```shell
5 changes: 5 additions & 0 deletions doc/litex/debian/README.md
@@ -2,6 +2,11 @@

This README documents, from scratch, how to generate all the images needed to run Debian on Litex + VexiiRiscv.

The hardware requirements are :
- RV64IMAFDC CPU
- A good amount of RAM (128 MB)
- SDCARD support

## Setup environment variables

```shell
2 changes: 1 addition & 1 deletion ext/NaxSoftware
2 changes: 1 addition & 1 deletion ext/riscv-isa-sim
2 changes: 1 addition & 1 deletion ext/rvls
Submodule rvls updated 2 files
+11 −11 README.md
+3 −4 src/hart.cpp
6 changes: 6 additions & 0 deletions src/main/scala/vexiiriscv/Generate.scala
@@ -25,9 +25,11 @@ object Generate extends App {
val sc = SpinalConfig()
val regions = ArrayBuffer[PmaRegion]()
val analysis = new AnalysisUtils
var reportModel = false

assert(new scopt.OptionParser[Unit]("VexiiRiscv") {
help("help").text("prints this usage text")
opt[Unit]("report-model") action { (v, c) => reportModel = true }
param.addOptions(this)
analysis.addOption(this)
ParamSimple.addOptionRegion(this, regions)
@@ -42,6 +44,10 @@ }
}

analysis.report(report)

if(reportModel){
misc.Reporter.model(report.toplevel)
}
}

//Generates a tilelink version of VexiiRiscv verilog using command line arguments
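
The new `--report-model` switch makes the generator call `misc.Reporter.model(report.toplevel)` after elaboration. A minimal way to exercise it is sketched below; since `Generate` extends `App`, its `main` can be driven programmatically, though in practice it is normally invoked from the command line (e.g. via sbt's `runMain`). Depending on the configuration, further arguments may be required, such as the PMA region options registered through `ParamSimple.addOptionRegion` above.

```scala
import vexiiriscv.Generate

// Sketch: drive the generator with the newly added flag. Extra arguments may be needed
// for some configurations (e.g. memory region options); treat this as an illustration.
object ReportModelDemo extends App {
  Generate.main(Array("--report-model", "--xlen", "32"))
}
```
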
33 changes: 17 additions & 16 deletions src/main/scala/vexiiriscv/Param.scala
@@ -126,6 +126,7 @@ class ParamSimple(){
var allowBypassFrom = 100 //100 => disabled
var additionalPerformanceCounters = 0
var withPerformanceCounters = false
var withPerformanceScountovf = true // Can be disabled to keep in sync with RVLS
var fetchL1Enable = false
var fetchL1Sets = 64
var fetchL1Ways = 1
@@ -357,17 +358,16 @@ }
}


// Define a few utilities to mutate the ParamSimple
def withRvm(): Unit = {
withMul = true
withDiv = true
}

def withBranchPredicton(): Unit = {
withBtb = true
withGShare = true
withRas = true
}

def withCaches(): Unit = {
fetchL1Enable = true
fetchL1Sets = 64
@@ -379,13 +379,11 @@

withLsuBypass = true
}

def withLinux(): Unit = {
privParam.withSupervisor = true
privParam.withUser = true;
withMmu = true
}

def withMmuSyncRead(): Unit = {
fetchTsp = MmuStorageParameter(
levels = List(
@@ -434,6 +432,7 @@
)
}

// Hash code used by the regression tests to generate a unique workspace folder per configuration
override def hashCode() = {
var hash = 0
val md = new StringBuilder()
@@ -453,7 +452,7 @@
Math.abs(md.toString.hashCode())
}


// Generate a human-readable name from most of the supported configuration options
def getName() : String = {
def opt(that : Boolean, v : String) = that.mux(v, "")
var isa = s"rv${xlen}i"
@@ -492,6 +491,7 @@
r.mkString("_")
}
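
The two helpers commented above pair up naturally: `getName()` gives a readable configuration label while the overridden `hashCode()` keeps workspace folders unique. A small illustration (the regression harness and its folder layout are not part of this commit):

```scala
import vexiiriscv.ParamSimple

// Illustration only: derive a unique, human-readable workspace name from a configuration.
object WorkspaceNameDemo extends App {
  val param = new ParamSimple()
  param.withRvm() // mutations are reflected in both the name and the hash
  println(s"workspace = ${param.getName()}_${param.hashCode()}")
}
```
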

// Initialize a scopt command line argument parser to take control of this ParamSimple
def addOptions(parser: scopt.OptionParser[Unit]) = {
import parser._
opt[Int]("xlen") action { (v, c) => xlen = v }
@@ -547,6 +547,7 @@
opt[Unit]("regfile-infer-ports") action { (v, c) => regFileDualPortRam = false }
opt[Int]("allow-bypass-from") action { (v, c) => allowBypassFrom = v }
opt[Int]("performance-counters") unbounded() action { (v, c) => withPerformanceCounters = true; additionalPerformanceCounters = v }
opt[Unit]("without-performance-scountovf") unbounded() action { (v, c) => withPerformanceScountovf = false }
opt[Unit]("with-fetch-l1") unbounded() action { (v, c) => fetchL1Enable = true }
opt[Unit]("with-lsu-l1") action { (v, c) => lsuL1Enable = true }
opt[Unit]("fetch-l1") action { (v, c) => fetchL1Enable = true }
@@ -587,6 +588,7 @@
opt[Int]("pmp-size") action { (v, c) => pmpParam.pmpSize = v }
opt[Int]("pmp-granularity") action { (v, c) => pmpParam.granularity = v }
opt[Unit]("pmp-tor-disable") action { (v, c) => pmpParam.withTor = false }
opt[Unit]("with-rdtime") action { (v, c) => privParam.withRdTime = true }
opt[Unit]("with-cfu") action { (v, c) => withCfu = true }
opt[Unit]("dual-issue") action { (v, c) =>
decoders = 2
@@ -616,6 +618,7 @@
}
}

// Generate the VexiiRiscv plugin list out of the current ParamSimple configuration
def plugins(hartId : Int = 0) = pluginsArea(hartId).plugins
def pluginsArea(hartId : Int = 0) = new Area {
val plugins = ArrayBuffer[Hostable]()
@@ -637,6 +640,7 @@
plugins += new misc.PipelineBuilderPlugin()
plugins += new schedule.ReschedulePlugin()

// Branch prediction
plugins += new LearnPlugin()
if(withRas) assert(withBtb)
if(withGShare) assert(withBtb)
@@ -652,10 +656,6 @@
jumpAt = 1+relaxedBtb.toInt,
bootMemClear = bootMemClear
)
// plugins += new prediction.DecodePredictionPlugin(
// decodeAt = decoderAt,
// jumpAt = decoderAt
// )
}
if(withGShare) {
plugins += new prediction.GSharePlugin (
@@ -672,6 +672,7 @@
}


// Fetch
plugins += new fetch.PcPlugin(resetVector)
plugins += new fetch.FetchPipelinePlugin()
if(!fetchL1Enable) plugins += new fetch.FetchCachelessPlugin(
@@ -718,7 +719,6 @@
}
}
}

plugins += new decode.DecodePipelinePlugin()
plugins += new decode.AlignerPlugin(
fetchAt = alignerPluginFetchAt,
@@ -751,13 +751,13 @@
trapAt = 0+regFileSync.toInt + 1 + intWritebackAt,
withBypasses = allowBypassFrom == 0
)

plugins += new execute.ExecutePipelinePlugin()

val lane0 = newExecuteLanePlugin("lane0")

// Main execution pipeline
val early0 = new LaneLayer("early0", lane0, priority = 0)
plugins += lane0

plugins += new SrcPlugin(early0, executeAt = 0, relaxedRs = relaxedSrc)
plugins += new IntAluPlugin(early0, formatAt = 0)
plugins += shifter(early0, formatAt = relaxedShift.toInt)
@@ -899,7 +899,8 @@
noTapCd = embeddedJtagNoTapCd
)
val lateAluAt = intWritebackAt


// Late ALU in the main execution pipeline
if(withLateAlu) {
val late0 = new LaneLayer("late0", lane0, priority = -5)
plugins += new SrcPlugin(late0, executeAt = lateAluAt, relaxedRs = relaxedSrc)
@@ -911,6 +912,7 @@

plugins += new WriteBackPlugin(lane0, IntRegFile, writeAt = withLateAlu.mux(lateAluAt, intWritebackAt), allowBypassFrom = allowBypassFrom)

// Second execution pipeline (dual-issue configs)
if(lanes >= 2) {
val lane1 = newExecuteLanePlugin("lane1")
val early1 = new LaneLayer("early1", lane1, priority = 10)
@@ -923,6 +925,7 @@
plugins += new BranchPlugin(early1, aluAt = 0, jumpAt = relaxedBranch.toInt, wbAt = 0)
if(withRvZb) plugins ++= ZbPlugin.make(early1, formatAt=0)

// Late ALU in the second execution pipeline
if(withLateAlu) {
val late1 = new LaneLayer("late1", lane1, priority = -3)
plugins += new SrcPlugin(late1, executeAt = lateAluAt, relaxedRs = relaxedSrc)
@@ -942,6 +945,7 @@
case _ =>
}

// FPU
if (withRvf || withRvd) {
plugins += new regfile.RegFilePlugin(
spec = riscv.FloatRegFile,
@@ -951,8 +955,6 @@
dualPortRam = regFileDualPortRam,
maskReadDuringWrite = false
)

// plugins += new execute.fpu.FpuExecute(early0, 0)
plugins += new WriteBackPlugin(lane0, FloatRegFile, writeAt = 9, allowBypassFrom = allowBypassFrom.max(2)) //Max 2 to save area on not so important instructions
plugins += new execute.fpu.FpuFlagsWritebackPlugin(lane0, pipTo = intWritebackAt)
plugins += new execute.fpu.FpuCsrPlugin(List(lane0), intWritebackAt)
@@ -968,7 +970,6 @@
if(withRvd) plugins += new execute.fpu.FpuXxPlugin(early0)
plugins += new execute.fpu.FpuDivPlugin(early0)
plugins += new execute.fpu.FpuPackerPlugin(lane0, ignoreSubnormal = fpuIgnoreSubnormal)
// plugins += new execute.fpu.FpuEmbedded()
}

plugins += new WhiteboxerPlugin(
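
To tie this file together: the plugin list produced by `plugins()` is what gets elaborated into an actual core. The sketch below shows one way that could look, assuming the `VexiiRiscv(plugins)` companion factory from the wider code base (not shown in this diff); real generation goes through `Generate.scala` above, which additionally wires in PMA regions and analysis options.

```scala
import spinal.core._
import vexiiriscv.{ParamSimple, VexiiRiscv}

// Assumption-level sketch: elaborate a ParamSimple configuration to Verilog.
// The VexiiRiscv(plugins) factory is assumed from the wider code base; some configurations
// may need extra setup (e.g. PMA memory regions, see Generate.scala earlier in this commit).
object ElaborateSketch extends App {
  val param = new ParamSimple()
  param.withCaches()
  SpinalVerilog(VexiiRiscv(param.plugins()))
}
```
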
14 changes: 13 additions & 1 deletion src/main/scala/vexiiriscv/decode/Service.scala
@@ -12,6 +12,10 @@ import scala.collection.mutable.ArrayBuffer

class DecodingCtx(val node : NodeBaseApi, val legal : Bool)

/**
* Provides an API which allows other plugins to request additional instruction decoding in the decode pipeline,
* exposing the decoded values as DecodePipeline payloads.
*/
trait DecoderService {
val elaborationLock = Retainer()
val decodingLock = Retainer()
@@ -23,14 +27,22 @@ trait DecoderService {
def addDecodingLogic(body : DecodingCtx => Unit)
}
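
As a usage illustration of the API documented above (hypothetical plugin and encoding, not part of this commit): a plugin registers extra decoding through `addDecodingLogic` and receives a `DecodingCtx`, whose `node` gives access to the decode pipeline payloads and whose `legal` flag can be driven. The FiberPlugin/host boilerplate follows the usual VexiiRiscv plugin pattern; retainer/locking details are omitted for brevity.

```scala
import spinal.core._
import spinal.lib.misc.plugin.FiberPlugin
import vexiiriscv.decode.{Decode, DecoderService}

// Hypothetical sketch: mark a custom-0 style encoding as legal through DecoderService.
class CustomLegalPlugin extends FiberPlugin {
  val logic = during setup new Area {
    host[DecoderService].addDecodingLogic { ctx =>
      // Drive the "legal" flag (assumed assignable from the callback) on a made-up encoding.
      when(ctx.node(Decode.INSTRUCTION) === M"-------------------------0001011") {
        ctx.legal := True
      }
    }
  }
}
```
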


/**
* Provides an API which allows other plugins to carry pipeline payloads from Fetch to Decode.
* Each carried payload can be specified to come from either the first or the last fetch-word of a given instruction.
* This is used by plugins such as branch prediction to carry data through the different pipelines.
*/
trait AlignerService{
val lastSliceData, firstSliceData = mutable.LinkedHashSet[NamedType[_ <: Data]]()
val elaborationLock = Retainer()
def addLastSliceDataCtx(that : NamedType[_ <: Data]) = lastSliceData += that
def addFirstSliceDataCtx(that : NamedType[_ <: Data]) = firstSliceData += that
}

/**
* Provides an API which allows injecting an instruction into the CPU pipeline.
* This is used by the PrivilegedPlugin to implement the RISC-V External Debug Support spec.
*/
trait InjectorService {
val injectRetainer = Retainer()
var injectPorts = ArrayBuffer[Flow[Bits]]()
2 changes: 1 addition & 1 deletion src/main/scala/vexiiriscv/execute/BranchPlugin.scala
@@ -18,7 +18,7 @@ import vexiiriscv.fetch.{Fetch, PcPlugin}
import vexiiriscv.memory.{AddressTranslationPortUsage, AddressTranslationService}
import vexiiriscv.misc.{PerformanceCounterService, TrapService}
import vexiiriscv.prediction.Prediction.BRANCH_HISTORY_WIDTH
import vexiiriscv.prediction.{FetchWordPrediction, HistoryPlugin, HistoryUser, LearnCmd, LearnService, LearnSource, Prediction}
import vexiiriscv.prediction.{FetchWordPrediction, HistoryPlugin, LearnCmd, LearnService, LearnSource, Prediction}
import vexiiriscv.schedule.{DispatchPlugin, ReschedulePlugin}

import scala.collection.mutable