Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - sociallender

Pages: 1 [2] 3 4 ... 19
16
Peer Lending Server / Re: List detection and then stops?
« on: January 03, 2016, 06:49:23 PM »
Not sure what the problem is but will try to help.  I assume that between when PLS starts and loan list detection begins, PLS is able to obtain your initial cash value.  If so, this means that PLS is able to connect to LC via their API.  Please check to make sure this is correct first. 

Also, what happens if you run the service manually?  You can do this by clicking the blue and white start icon on the control panel.  Can you post what the log file shows? 

17
Peer Lending Server / Re: Model score
« on: November 09, 2015, 09:09:44 AM »
The model is based on a gradient boosted machine algo, which is an ensemble of decision trees.  I haven't done a good job of explaining the model but will try to document it when I can find the time.  The model is probably 6 months old and due for an update; I'll probably get around to pushing a new model before the beginning of the new year.  Let me know if you have any specific questions...
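
If you want a rough feel for what that means, here is a toy sketch in Python/scikit-learn of a gradient boosted classifier on loan data.  The file name, feature list, and label column are placeholders for illustration only, not the actual PLS model or its training data.

# Toy sketch of a gradient boosted machine (an ensemble of decision trees).
# File name, features, and label are hypothetical, not the real PLS inputs.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

loans = pd.read_csv("historical_loans.csv")               # hypothetical training file
features = ["intRate", "dti", "annualInc", "empLength"]   # hypothetical feature set
X = loans[features]
y = loans["defaulted"]                                     # hypothetical label: 1 = charged off, 0 = fully paid

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

gbm = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05, max_depth=3)
gbm.fit(X_train, y_train)

# The "model score" idea: predicted probability of default for new listings.
print(gbm.predict_proba(X_test[:5])[:, 1])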

SL



18
Peer Lending Server / Re: Filter using borrowers employment years
« on: November 09, 2015, 09:02:41 AM »
The employment data provided by LC are discrete values (not continuous).  In other words, the only possible values are 0, 12, 24, 36, 48, 60, 72, 84, 96 and 108 (months).  If you want to include all loans >=24, then just select all values 24 and greater in the select box.

However, if you wanted to create your own custom filter, you could use the following in the custom filter input box:  empLength >= 24.  This applies the same logic as the selection described above.
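
Just to make the idea concrete, here is a rough pandas illustration of applying an expression like that to a table of listings.  This is only an illustration of the concept, not the actual PLS filter engine, and the sample data is made up.

# Illustration only: applying a filter expression like "empLength >= 24"
# to a table of listings.  Not the PLS filter engine; sample data is made up.
import pandas as pd

listings = pd.DataFrame({
    "id": [101, 102, 103],
    "empLength": [0, 24, 108],   # discrete values, in months
})

matches = listings.query("empLength >= 24")
print(matches)                   # keeps rows 102 and 103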

Hope that helps,
SL


19
Peer Lending Server / Re: Enhancement : Naming filters
« on: October 09, 2015, 03:04:53 PM »
Sorry, you can only have 1 filter in the new version.

SL

20
Peer Lending Server / Re: Two Quick Questions regarding PLS behavior
« on: September 21, 2015, 01:35:09 PM »
1)  There is no price markdown or discount of notes.  I am not sure exactly what you mean by this question, but the dollar amount for the "Investment Amount per Note" field is the amount invested in each new note regardless of any other field.  The "Minimum Cash Level" field essentially disables investing until your account exceeds the minimum cash level + investment amount per note.  It basically allows you to keep a certain level of cash in your account (see the short sketch after this list).

2)  PLS does not sell notes.   Nor does it purchase on foliofn.  It will only add new notes to your portfolio.
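
To make the cash-level logic in 1) concrete, here is a tiny sketch of the check as described.  The names are illustrative only, not the actual PLS code.

# Sketch of the "Minimum Cash Level" behavior described in 1).
# Names are illustrative; this is not the actual PLS source.
def can_invest(available_cash, min_cash_level, amount_per_note):
    # Investing is enabled only once the account covers
    # the minimum cash level plus one note's investment amount.
    return available_cash >= min_cash_level + amount_per_note

print(can_invest(available_cash=530.0, min_cash_level=500.0, amount_per_note=25.0))  # True
print(can_invest(available_cash=510.0, min_cash_level=500.0, amount_per_note=25.0))  # False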

Hope that answers your questions,
SL

21
Peer Lending Server / Re: After Reading Your Blog
« on: September 15, 2015, 02:28:11 PM »
PLS does not implement EFP.  Sorry.

22
Peer Lending Server / Re: Oracle On Windows 10
« on: September 10, 2015, 11:19:28 PM »
Glad it's working!  Thanks for the update.

SL

23
Peer Lending Server / Re: New User, Uh Oh
« on: September 09, 2015, 09:28:28 PM »
Sorry for the late reply.  Have you been able to use PLS after funding the account with more than $25?  It may be a rounding issue in the code, but I haven't dug deep yet.  I will probably add this fix in an updated release sometime in the future.

SL

24
Peer Lending Server / Re: Oracle On Windows 10
« on: September 09, 2015, 09:26:49 PM »
I had another user with VB on win10 and this is how he fixed it:

"i didnt realize the oracle vm would be setting up another adapter.  with bitdefender the auto/default settings are never correct...in case you have anyone else that runs into this issue, the settings that seem to be working for me are:

network type:  home/office
stealth mode:  off
generic:       on "

Although, I am not sure if this will help in your case.  I also found this post on the VB forum saying that 5.0.4 is not supported on win10:

https://forums.virtualbox.org/viewtopic.php?f=6&t=70054&p=334877&hilit=windows+10#p334877

Perhaps try an older version as a test...

It looks like you are not alone in having network issues with VB on win10:

https://forums.virtualbox.org/viewtopic.php?f=6&t=68590&p=334845&hilit=windows+10+network#p334845
https://forums.virtualbox.org/viewtopic.php?f=6&t=66682&p=334708&hilit=windows+10+network#p334708

Just found a link to someone reporting that it may work on a test build release:

https://www.virtualbox.org/wiki/Testbuilds

Let me know how it goes...
SL


25
Peer Lending Server / Re: API endpoint?
« on: August 02, 2015, 11:19:48 AM »
Sorry, there is no API currently available.  The software is GUI driven.  If you know R, you may be able to roll your own?  The IRR is given on the historical analysis page based on the criteria provided.  It uses the historical payment history (cash flow) per loan to calculate the IRR for historical loans.  The process is described here:

http://peerlendingserver.com/uncategorized/1310/
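
As a rough illustration of the cash-flow approach (not the exact PLS calculation), here is a small Python sketch that finds a loan's IRR as the monthly rate that makes the discounted payments equal what was invested.  The cash flows are made up.

# Rough illustration of computing IRR from a loan's payment history.
# The cash flows are made up; this is not the exact PLS calculation.
def npv(rate, cashflows):
    # Net present value of monthly cash flows at a monthly rate.
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-7):
    # Simple bisection: find the monthly rate where NPV crosses zero.
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# $25 invested, then 12 monthly payments of $2.20 received (hypothetical note)
flows = [-25.0] + [2.20] * 12
monthly_rate = irr(flows)
print("annualized IRR: {:.2%}".format((1 + monthly_rate) ** 12 - 1))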

Hope that helps.
SL


26
Peer Lending Server / Re: Just starting out, have some questions:
« on: July 04, 2015, 02:20:58 PM »
That's correct.  LC has been consistent recently with new-note list times.  However, sometimes it can be either early or late, so it's best to start one minute early to load the system and prepare.  It basically sends continuous API requests until new notes appear.

Also, cron, the Linux scheduler, has a 60-second granularity.  It was just easier to set it up by minutes than to add in seconds programmatically.

The clock should be OK as it has NTP running (as long as there are no firewall issues).  However, if the host OS clock is way off, I don't know how much it will affect the PLS instance.
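
For the curious, list detection is roughly a loop like the sketch below.  The endpoint URL, header, and showAll parameter are from memory and may need adjusting; the code itself is a simplified illustration rather than the actual PLS implementation.

# Simplified sketch of list detection: poll the listings endpoint until the
# count of newly listed notes jumps.  The URL, header, and showAll parameter
# are from memory and may need adjusting; this is not the actual PLS code.
import time
import requests

API_KEY = "your-api-key"    # placeholder
URL = "https://api.lendingclub.com/api/investor/v1/loans/listing"

def new_note_count():
    resp = requests.get(URL,
                        headers={"Authorization": API_KEY},
                        params={"showAll": "false"})   # only the most recent list
    resp.raise_for_status()
    return len(resp.json().get("loans") or [])

baseline = new_note_count()
while True:
    count = new_note_count()
    if count > baseline:       # new notes have appeared
        print("New list detected: %d notes" % count)
        break
    time.sleep(1)              # keep sending requests until the new list shows up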

SL
 


27
Peer Lending Server / Re: Just starting out, have some questions:
« on: July 04, 2015, 09:24:55 AM »
1) It should stop with an error that you do not have sufficient cash to meet your minimum investment (at minimum $25).  Since you don't have any funds, you won't be able to order any notes.

2) Yes, you can run 2 or more instances.  However, I don't recommend it as you will be running list detection and downloading the notes twice.  I would copy both filters (given at the bottom of the filter page) and join them together with a logical OR, something like (term==36 & grade=='A') | (term==60 & grade=='A').  I haven't tested this filter, but yours should resemble it, with the logical groupings.  You then manually put this as your filter in the custom tab (see the short example after this list).

3) I am not sure.  Try peercube, nickelsteamroller or LR.  They are pretty good stat sites. 
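
Just to illustrate the grouping in 2), joining two saved filters comes down to wrapping each one in parentheses and putting an OR between them.  This is only an illustration; the criteria are just the example ones from above.

# Illustration of joining two saved filters with a logical OR, as in 2).
# The criteria are just the example ones from above.
filter_a = "(term==36 & grade=='A')"
filter_b = "(term==60 & grade=='A')"
combined = filter_a + " | " + filter_b
print(combined)   # paste the result into the custom filter tab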

SL

28
Peer Lending Server / Re: Just starting out, have some questions:
« on: July 03, 2015, 06:21:54 PM »
1)  I am not sure what a resisted loan is.  Do you mean a relisted loan? PLS can only filter by what is available in the API listings: https://www.lendingclub.com/developers/listed-loans.action.  I don't see that data field available, so I don't think it's possible.
2) PLS will only invest in new loans recently listed.  It will never invest in loans from previous lists, as the software only downloads loans from the most recent list.  This prevents reinvestment in the same loan (unless you run the software multiple times between lists and the note is not already fully funded; by default, PLS only runs at list time and not in between, though you can configure it to run as many times as you like).  So, just to be clear: via the API, PLS downloads only the new notes from the most recent list (this is a feature of LC's API).  The other benefit here is that all of the notes don't need to be downloaded, which requires bandwidth and extra time.  Downloading just the new notes is very fast (which is important with the competition these days).
3) If you want to do historical analysis only on completed notes (not current), click that box and it will only show you notes which have been completed.  This is only used for historical analysis purposes.
4) Fractional or Whole.  Notes can be listed on the whole market before being sent to the fractional market.  PLS works in the fractional market ($25 increments).
5)   Yes, your filter appears to be pretty tight. You should expect to see around 0.19% of new notes match your criteria.  For example, if 500 new notes are listed in a given day, you may have about one note match your criteria (0.0019 x 500).  However, there is no guarantee.  I would start with your most general filter criterion (for example, term 36 months), then add the remaining filters one at a time and watch the filtered-notes percentage to see which is the most restrictive, and perhaps modify your filter given that knowledge.
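
Here is a rough sketch of that one-at-a-time approach in Python/pandas.  The file name and criteria are hypothetical placeholders; this is just to show the idea of watching how each criterion cuts the pool down.

# Sketch of adding filter criteria one at a time to see which is most
# restrictive, as suggested in 5).  File name and criteria are hypothetical.
import pandas as pd

listings = pd.read_csv("listed_loans.csv")      # hypothetical snapshot of new notes
criteria = ["term == 36", "grade == 'A'", "dti < 15", "empLength >= 24"]

remaining = listings
for c in criteria:
    remaining = remaining.query(c)
    pct = 100.0 * len(remaining) / len(listings)
    print("%-20s -> %.2f%% of notes remain" % (c, pct))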

SL

29
Investors - LC / Re: Wierd goings on at LC
« on: June 30, 2015, 09:30:07 AM »
Can someone explain to me what is going on?  From the previous posts, it looks like there is a surplus of loans available currently (over 4K notes).  However, the number of new loans being added at listing time now seems very small.  Is this correct?  I have a threshold that detects when new notes have been listed on the platform (15 notes more than the previously listed new notes).  However, the system hasn't detected new notes recently.  I am assuming that's because the number of new notes being added is small, since most of them have already been listed?
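
For reference, the threshold I mentioned is basically the check below.  It is just an illustration of the rule, not the actual implementation.

# Illustration of the detection threshold mentioned above: treat it as a new
# list only when the count of newly listed notes is at least 15 higher than
# the previous count.  Not the actual implementation.
def new_list_detected(current_new_count, previous_new_count, threshold=15):
    return current_new_count - previous_new_count >= threshold

print(new_list_detected(current_new_count=220, previous_new_count=200))  # True
print(new_list_detected(current_new_count=205, previous_new_count=200))  # False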

30
Investors - P / Re: Prosper API calls using cURL
« on: May 07, 2015, 10:28:46 PM »
Quote:

"LC doesn't provide historical data through API. Actually LC is worse as there is no consistency on the download URL and when LC will create a new file instead of updating the old files. In last 3 years I have seen them changing access at least six time. I wouldn't even go into LC keep changing fields in historical download. Even the latest download few days ago changed and removed fields.

RESTful API specs doesn't have the limit but underlying services and infrastructure layers do including on client side. The timeout settings on both server and client interfere with large HTTP Response.

The Prosper Historical Listing dataset is very very large (Over 350,000 records at last count). Every entry has 545 attributes. 350,000 x 545 with Prosper versus 500,000 x 40 or so with LC."


Just for the record, there is no limit to an HTTP request size that I am aware of (RESTful API).  If any limit is present, it is usually self-imposed.  Also, I am not sure that the size of the data set would be 2+ GB given proper compression (but I don't know, since I have never seen the full data set).  The procedure you use above to download the data may work, but to anyone just trying to download the data, it seems a bit arduous.

To be fair to Prosper, I can understand why they would want to break up the data into chunks.  Perhaps in the API, they could make different endpoints similar to the way LC gives separate links to their historicals based on date range.

I never said that LC uses an API to provide historical data (I am VERY familiar with the LC interface).  However, they do break up the data so that it is not one large set to download (reasonable enough).  My suggestion (of many) is to perhaps do the same at Prosper via their API.  Since they do not offer flat files anymore, they would have to implement it somehow in the API (not my first choice, but doable).

Most web servers (such as Apache) do not limit GET request sizes by default (you can, however, set a limit).  PHP/Apache typically have multiple POST (not relevant here, but informative) limit parameters that can be adjusted.  HTTP clients such as browsers do have inherent limitations; however, many developers use tools to bypass those and set the appropriate parameters to allow larger transfers if required.  If you do a verbose dump of the connection with Prosper, you will notice that no payload gets transferred prior to the reset.  Typically, this means that the data has not been prepared in time before the timeout/connection reset (a timeout presumably imposed on their end).  With large HTTP transfers, you will see the payload while it is being sent over the wire (depending on verbosity).  In this case, it appears that the data never gets a chance off the starting line before a resource limit has been reached (again, probably self-imposed, or worst case a parameter that has not been properly adjusted).

Also, you would be surprised at the compression ratios of sparse data (such as loan information).  Again, I don't think the file should be 2+ GB even with 550 variables compressed (many of the data points are null anyway).  But again, I don't know.  What is the file size of the historical data set that you have collected after merging all of the chunks together?  Perhaps you can compress it and tell us what ratio you get.
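
If you want to check, something as simple as the sketch below will report the ratio.  The file name is a placeholder for whatever merged file you have.

# Quick check of the compression ratio of a merged historical file.
# The file name is a placeholder for whatever you have assembled.
import gzip
import os
import shutil

src = "prosper_listings_merged.csv"     # placeholder
dst = src + ".gz"

with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)

original = os.path.getsize(src)
compressed = os.path.getsize(dst)
print("%.1f MB -> %.1f MB (ratio %.1fx)" % (original / 1e6, compressed / 1e6, original / compressed))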

I think it is clear that their API is less than ideal for historicals.  If an API is not the right transport, then by all means use a flat file (both are HTTP based).  Make it easy to access their data and they will have more developers joining the party.  Forcing cumbersome workarounds is unnecessary in this day and age.  Not responding to support requests is terrible.  Not updating your web site to address these issues is bad.  And, in your own words, not focusing on the retail investor makes me not even want to participate.

 

