🦸Summary view
Visualizes your validator performance in multiple time frames
The summary table is part of the validator dashboard. It provides detailed performance metrics for each validator group, including efficiency and validator rewards across various time frames. It also features a historical chart view.
Each row in the summary table represents a validator group. Users with multiple groups will see an additional row above the thin orange line, labeled as the Σ-row.
Expanding a row provides detailed data for each group.
The summary table can be viewed with either absolute or relative values. This is especially useful for large entities with millions of attestations, where relative values are easier to read at a glance.
Summary table with relative values
Summary table with absolute values
The total rewards are divided by the number of hours in the selected period and then by the total stake (32 ETH per validator) to obtain the hourly return per ETH staked. This hourly rate is then scaled up to an annual percentage.
APR = ((RewardsInPeriod / HoursInPeriod) / (32 * ValidatorCount)) * (24 * 365) * 100
The All time period currently shows the 90-day APR. This is subject to change in the future.
Validators are randomly selected for duties, such as proposing blocks or participating in the sync committee. The chance of being assigned duties depends on the total number of validators on the network and their effective balance.
Luck values are purely informative and cannot be actively influenced.
Blocks
For each epoch, the probability that a validator would be chosen to propose a block is calculated. The number of blocks the validator actually proposed is then compared with this expected number and expressed as a percentage.
In practice, this calculation is performed for all validators over the selected time frame.
Sync Committee
For each sync committee election, the probability that a validator would be chosen as a committee member is calculated. The number of actual participations is then compared with the expected number and expressed as a percentage.
In practice, this calculation is performed for all validators over the selected time frame.
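Both luck values follow the same pattern: actual duty assignments divided by expected assignments. A minimal sketch under the assumption that proposer selection is proportional to the validators' share of total effective balance (names and the helper below are illustrative, not the dashboard's implementation):

```python
SLOTS_PER_EPOCH = 32  # mainnet constant

def expected_blocks(stake_share: float, epochs: int) -> float:
    """Expected proposals when selection is proportional to effective balance.

    stake_share: the validators' fraction of total network effective balance.
    """
    return SLOTS_PER_EPOCH * stake_share * epochs

def luck_percent(actual: int, expected: float) -> float:
    """Actual vs. expected duty assignments; >100% means luckier than average."""
    return actual / expected * 100

# Example: validators hold 0.05% of total stake over 1000 epochs
exp = expected_blocks(0.0005, 1000)   # 16.0 expected blocks
print(luck_percent(20, exp))          # 125.0 → 125% block luck
```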
[1]
View the historical performance of your validator groups by clicking this button and switching to the summary chart view.
[2]
The summary chart view is perfect for debugging issues, as it provides historical performance data with epoch-level granularity. The Groups dropdown lets users filter specific groups, while the Total line (orange line) shows the average performance of all groups in the dashboard.
To identify missed blocks, sync duties, or attestations, users can use the Efficiency dropdown to filter for those specific duties.
The status column offers a quick way to check key duties, such as your current sync committees, whether you're part of the upcoming sync committee, and if any of your validators have been slashed. Hover over the icons for more details.
The validator column displays the state of each validator in each group:
Online validators (green icon)
Offline validators (red icon)
Exited validators (gray icon)
A comma-separated list of validator indices for each group can be found in the popout modal for each row in the table.
The relative value only accounts for online and offline validators, ignoring those that have already exited the chain. This offers quick insights into whether your group is experiencing issues.
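A sketch of how such a relative value could be computed, counting only online and offline validators and ignoring exited ones (the function name is an illustrative assumption):

```python
def online_share(online: int, offline: int) -> float:
    """Percentage of non-exited validators that are currently online.

    Exited validators are excluded entirely, as described above.
    """
    active = online + offline
    if active == 0:
        return 0.0          # no non-exited validators in the group
    return online / active * 100

# Example: 95 online, 5 offline, any number exited
print(online_share(95, 5))  # 95.0
```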
The Validator Efficiency metric is a comprehensive measure of validator performance that combines multiple components. An arrow next to the value indicates how a group performs relative to the network average:
Red arrow: Underperforming the network average by at least 0.25%.
Green arrow: Outperforming the network average by at least 0.25%.
Yellow arrow: Within a range of +/- 0.25% of the network average.
Hovering over an arrow displays the exact performance of each group. For historical performance, switch to the summary chart view.
Attester Efficiency: This measures how much reward a validator could have earned for attestations versus how much they actually earned. For example, if a validator could have earned 100 GWEI for attestations but only received 98 GWEI, the attester efficiency would be 98%.
Attestations (%-value): This represents the percentage of attestations successfully included in the network. For instance, if a validator managed to include 99 out of 100 attestations, the Attestation (%) value would be 99%.
The absolute values can be viewed by hovering over this percentage or by switching to the absolute-table view.
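The two examples above can be expressed directly; the function names here are illustrative, not the dashboard's code:

```python
def attester_efficiency(earned_gwei: float, max_gwei: float) -> float:
    """Reward actually earned for attestations vs. the maximum possible, in percent."""
    return earned_gwei / max_gwei * 100

def attestation_inclusion(included: int, total: int) -> float:
    """Share of attestations successfully included on-chain, in percent."""
    return included / total * 100

print(attester_efficiency(98, 100))   # 98.0 → 98% attester efficiency
print(attestation_inclusion(99, 100)) # 99.0 → 99% attestation rate
```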