50k articles
200 indexed fields?? Will you search across all of those fields? Or do you just want to store 200 fields and search through 5-10 of them?
Is 10.2 KB the size of a record? The sum of the averages of each field?
What embedding size will you use? For multilanguage content, I use 1024 dimensions, which are floats (32 bits, i.e. 4 bytes, per dimension).
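As a quick sanity check, here's the raw footprint of the vectors alone, assuming float32 embeddings and the 50k articles mentioned above (before any index overhead):

```python
# Raw memory of the embedding vectors alone, assuming float32 (4 bytes/dim).
num_docs = 50_000   # articles, from the numbers above
dims = 1024         # multilanguage embedding size I use
bytes_per_dim = 4   # float32

vector_bytes = num_docs * dims * bytes_per_dim
print(f"Raw vectors: {vector_bytes / 1e6:.0f} MB")  # ~205 MB before index overhead
```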
So, the number of records, multiplied by the average size of an indexed field, multiplied by the number of indexed fields, multiplied by 2-3 (index overhead) will give you an approximate in-memory size of the collection.
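A minimal sketch of that estimate (see the snippet below); the field count and average field size are placeholder assumptions you'd swap for your real numbers:

```python
# Rough memory estimate for a Typesense collection (rule of thumb, not exact).
num_docs = 50_000              # articles
num_indexed_fields = 10        # assume you only index 5-10 of the 200 fields
avg_indexed_field_bytes = 512  # assumed average size of each indexed field
overhead_factor = 3            # 2-3x overhead for the in-memory index

raw_indexed_bytes = num_docs * num_indexed_fields * avg_indexed_field_bytes
estimated_ram_bytes = raw_indexed_bytes * overhead_factor

print(f"Indexed data: {raw_indexed_bytes / 1e9:.2f} GB")
print(f"Estimated RAM: {estimated_ram_bytes / 1e9:.2f} GB")
```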
Then, in my experience, the sweet spot for a big Typesense cluster is between 4-8 cores and 32-64 GB of RAM per node. Multiply that by 3 for HA and by the number of tiers and you get the estimated cost. You can also work out the number of tiers or clusters needed from the memory estimate above.
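A rough sketch of that last step; the per-node RAM and the monthly price are hypothetical placeholders, not real Typesense Cloud pricing:

```python
# Back-of-the-envelope tier count and cost, using the RAM estimate from above.
import math

estimated_ram_gb = 60    # assumed output of the previous estimate
node_ram_gb = 64         # sweet spot: 32-64 GB per node
ha_replicas = 3          # 3 nodes for HA
node_monthly_cost = 300  # hypothetical $/node/month, replace with real pricing

tiers_needed = math.ceil(estimated_ram_gb / node_ram_gb)
total_nodes = tiers_needed * ha_replicas

print(f"Tiers/clusters needed: {tiers_needed}")
print(f"Total nodes (with HA): {total_nodes}")
print(f"Estimated monthly cost: ${total_nodes * node_monthly_cost}")
```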