I am by no means a statistician, but 2 years ago when I first learned about LC and decided I had to get involved as a lender, I spent many hours learning how to work with Excel Pivot Tables because that was the closest tool I had available. I came up with the following formula, and I'll do my best to explain it as simply as possible. I'd love some feedback from the mathematically inclined community.

Example Scenario

**1000 total loans with the following dumbed-down criteria:**

Ownership = Home / Rent

Purpose = Debt Consolidation / CC Refi

Our loans have an overall 5% total default rate (so 50 defaults). Default breakdown:

Our 1000 loans based on Ownership:

Renters = 30 defaults out of 500 loans = 6% default rate, therefore = 1.2x average of 5%

Owners = 20 defaults out of 500 loans = 4% default rate, therefore = 0.8x average of 5%

Our 1000 loans based on Purpose:

Debt Consolidation = 35 defaults out of 560 loans = 6.25% default rate, therefore = 1.25x average of 5%

CC Refi = 15 defaults out of 440 loans = 3.41% default rate, therefore = 0.682x average of 5%
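The per-segment factors above are just each segment's default rate divided by the overall 5% rate. A minimal sketch of that arithmetic, using the made-up counts from this example (the function name `risk_factor` is mine, not anything from LC):

```python
# Made-up counts from the example scenario above
total_loans = 1000
total_defaults = 50
base_rate = total_defaults / total_loans  # 0.05, i.e. the overall 5% rate

def risk_factor(defaults, loans):
    """Segment default rate divided by the overall default rate."""
    return (defaults / loans) / base_rate

renters = risk_factor(30, 500)       # 6% / 5%     -> 1.2x
owners = risk_factor(20, 500)        # 4% / 5%     -> 0.8x
debt_consol = risk_factor(35, 560)   # 6.25% / 5%  -> 1.25x
cc_refi = risk_factor(15, 440)       # ~3.41% / 5% -> ~0.682x
```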

**So, can I conclude?** Take two new loans: a renter pursuing a CC refi vs. a home owner consolidating debt. Can their default risk be calculated as follows?

Renter => 1.2 x 0.682 = 0.8184 x 5% = 4.092% default risk

Owner => 0.8 x 1.25 = 1 x 5% = 5% default risk

Is this math anywhere near the ballpark of sane statistical analysis?!

P.S. Thank you to anyone who managed to read (and understand) all of that!