Controlling rate at which a simulation runs on clients using Photon PUN?

edited November 2019 in Any Topic & Chat
Any help or ideas appreciated! :)

I've built a simple 1v1 RTS in Photon PUN that runs as an authoritative 'Simulation': a networked object that accepts player input and uses precise server time for synced game timers. The unit count is fairly low, 200 max per session (100 per player). There is no unit micro; the only inputs are building placement and things like upgrades, so there is very little player input outside the base. Imagine a tower defense where units simply march forward on creation (it isn't a tower defense game, but that's the general flow).

My concern: I know that if one client's machine performance (FPS) drops too far, it can cause issues with the networked session. What's the best way to set up a variable, streamed by the server or something similar, that slows the rate at which the game runs on both clients?

I understand the concept of 'Game Frames' (or a fluctuating FixedUpdate rate): update all game logic in one method x times per second and use diagnostic stopwatches to send FPS info from each player. But would clients simply count game frames per second, stream that to the server, and receive back a new desired simulation FPS? How is this achieved in Quantum's 'Time Dilation', and how can I build a similar, simple solution in Photon PUN? Would the precise number of game frames run on both clients need to be counted and checksummed by the server, slowing or stopping the game if they stray apart by more than x frames?

It's easy to show "(Waiting for player...)" when dropped or late RPCs (due to ping) are being sorted through my simulation's Pending > Confirmed > Processed action pipeline, but I have no clue right now how to manage fluctuations in machine performance using Photon PUN.
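To make the question concrete, here is a minimal, engine-agnostic sketch of what I mean, written in plain Python rather than actual Unity/PUN C# (so `SimulationClock` and `negotiate_sim_fps` are hypothetical names of mine, not Photon APIs). The idea: each client advances game frames with a fixed-timestep accumulator, periodically reports the rate it actually achieved, and the master client picks the slowest clamped rate as the new shared simulation FPS, which it would then broadcast (e.g. via RPC in PUN).

```python
class SimulationClock:
    """Fixed-timestep accumulator: advances 'game frames' at sim_fps,
    independent of how often the render loop calls tick()."""

    def __init__(self, sim_fps):
        self.sim_fps = sim_fps      # current shared simulation rate
        self.accumulator = 0.0      # unconsumed real time in seconds
        self.game_frame = 0         # total frames run; could be checksummed across clients

    def tick(self, real_dt, step_fn):
        """Consume real elapsed time and run as many fixed steps as fit."""
        self.accumulator += real_dt
        step = 1.0 / self.sim_fps
        while self.accumulator >= step:
            step_fn(self.game_frame)   # run all game logic for one frame
            self.game_frame += 1
            self.accumulator -= step


def negotiate_sim_fps(reported_fps, floor=10, ceiling=30):
    """Master-client side: take every client's achieved game-frame rate,
    pick the minimum, and clamp it, so the slowest machine sets the pace
    (a crude stand-in for Quantum-style time dilation)."""
    return max(floor, min(ceiling, min(reported_fps)))


# Example: a 20 fps sim clock fed 0.25s of real time runs 5 game frames.
clock = SimulationClock(sim_fps=20)
frames = []
clock.tick(0.25, frames.append)
print(clock.game_frame)                    # 5

# One client only managed 17 sim fps, so everyone drops to 17.
print(negotiate_sim_fps([28, 17, 60]))     # 17
```

Is this roughly the right shape for PUN, with the master client broadcasting the negotiated rate and each client comparing `game_frame` counts to detect drift?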