Virtually every SEO tool is powered by scraped search results. To get more insight from each scrape, and thus keep costs down for end users, tools generally use this parameter to force extended search results onto the first page, instead of the default 10 organic results a user would normally see. Moz Pro, for example, has long standardized on &num=50, i.e. 5 “pages” of results per scrape.
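As a rough illustration, here is what that looks like at the URL level. This is a simplified sketch (real scrapes carry many more parameters, and the example query is arbitrary), but it shows how the num parameter changes a single request:

```python
from urllib.parse import urlencode

# Default request: Google returns roughly 10 organic results on page one.
default_url = "https://www.google.com/search?" + urlencode({"q": "seo tools"})

# With num=50 appended, a single fetch used to return up to 50 results,
# i.e. 5 "pages" worth of rankings in one request.
extended_url = "https://www.google.com/search?" + urlencode({"q": "seo tools", "num": 50})

print(default_url)   # https://www.google.com/search?q=seo+tools
print(extended_url)  # https://www.google.com/search?q=seo+tools&num=50
```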
This parameter had actually been deprecated for many years but continued to be unofficially supported. In mid-September, it gradually stopped working, forcing SEO tools to seek alternative methods. Some tools, including Moz and STAT, prepared an alternative we call “stitching”: piecing together a series of paginated results, 10 at a time, into one longer set of results. This approach has various difficulties, many of which can be mitigated or avoided, but its main implication is cost, which ends up significantly higher, to the point of being unsustainable in many cases.
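To make the trade-off concrete, here is a minimal sketch of what stitching implies at the request level. The helper name, the 50-result target, and the query are illustrative assumptions, not Moz's actual implementation; the point is simply that one request becomes several:

```python
from urllib.parse import urlencode

def stitched_urls(query: str, total: int = 50, page_size: int = 10) -> list[str]:
    """One URL per page of results; fetching them all and concatenating the
    organic results approximates what a single num=50 scrape used to return."""
    urls = []
    for offset in range(0, total, page_size):
        # start= is Google's pagination offset: 0, 10, 20, ...
        urls.append("https://www.google.com/search?" + urlencode({"q": query, "start": offset}))
    return urls

# Five requests where one used to suffice: this is where the extra cost comes from.
for url in stitched_urls("seo tools"):
    print(url)
```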
This should also be seen in the context of SERP data costs rising generally in recent years, as tools are forced to mimic real browsers ever more closely in order to get accurate, representative rankings.