
search query by ListingID

asked 2018-11-19 00:52:59 -0600

I am trying to create a query using PHRETS to download records by L_ListingID:


    $results = $rets->Search('Property', 'RU_2', '*', ['Limit' => 1, 'Select' => "224934467"]);

I am getting the return error "Invalid Select". This is a test query; ultimately I want to write a query that selects up to 100 L_ListingIDs. Can you point me in the right direction?



Here is the solution:

    $results = $rets->Search('Property', 'RU_2', 'L_ListingID=224934467', ['Limit' => 1]);

Does the property class need to be in the query?

rtoutant ( 2018-11-19 01:09:41 -0600 )
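To the follow-up question: in PHRETS the property class is passed as the second argument to Search(), not inside the DMQL query itself. A minimal sketch of building the ID query (the buildIdQuery helper is illustrative, not part of PHRETS; many RETS servers also expect the DMQL wrapped in parentheses):

```php
<?php
// Hypothetical helper: build a DMQL query string from one or more listing IDs.
// L_ListingID is this MLS's field name; parentheses are the usual DMQL form.
function buildIdQuery(array $ids): string
{
    return '(L_ListingID=' . implode(',', $ids) . ')';
}

$query = buildIdQuery(['224934467']);
echo $query, "\n"; // (L_ListingID=224934467)

// The resource ('Property') and class ('RU_2') stay as Search() arguments:
// $results = $rets->Search('Property', 'RU_2', $query, ['Limit' => 1]);
```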

What RETS version are you using?
RETS/1.8 and up allow you to send an empty query.
Normally you would want to do incremental searches by date: (L_UpdateDate=2018-11-18+).
To pull a list where you know the IDs, use the query (ListingID=123,124,128) to pull 3 listings by ID.
Select, if you include it, is the comma-separated list of field names you want returned.

bwolven ( 2018-11-19 08:38:46 -0600 )
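The incremental-by-date pattern above can be sketched like this (the L_UpdateDate field and date value come from this thread; the helper function is illustrative):

```php
<?php
// Build a DMQL "modified since" query for incremental pulls.
// A trailing '+' in DMQL means "on or after this value".
function buildSinceQuery(string $field, string $date): string
{
    return '(' . $field . '=' . $date . '+)';
}

$query = buildSinceQuery('L_UpdateDate', '2018-11-18');
echo $query, "\n"; // (L_UpdateDate=2018-11-18+)

// Select trims the columns returned; it is a comma-separated field list:
// $results = $rets->Search('Property', 'RU_2', $query,
//     ['Limit' => 100, 'Select' => 'L_ListingID,L_UpdateDate']);
```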

Were you able to get this working?

bwolven ( 2018-11-20 13:01:37 -0600 )

I think what I am trying to do is this:

Download just the basics.

Compare the input dates to determine which listings require updates.

Download up to 200 or so IDs at a time to update (any more than 250 and I get a cURL error).

The question I have now involves pictures: is there a query to download just a particular index?

rtoutant ( 2018-11-20 13:24:01 -0600 )
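On the pictures question: photos come through GetObject rather than Search. A sketch, assuming PHRETS 2.0 where GetObject takes the resource, object type, content ID, and the object IDs to fetch; the 'Photo' type name and the 1-based index convention are assumptions, so check your server's metadata:

```php
<?php
// Hypothetical helper: describe which photo index to request for a listing.
// RETS object IDs are usually 1-based photo indexes; '*' means all photos.
function photoRequest(string $listingId, int $index): array
{
    return [
        'content_id' => $listingId,
        'object_id'  => (string) $index,
    ];
}

$req = photoRequest('224934467', 1);
echo $req['object_id'], "\n"; // 1

// With a live PHRETS session this would be roughly:
// $objects = $rets->GetObject('Property', 'Photo', $req['content_id'], $req['object_id']);
```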

I can pull much more than 200 with PHRETS, but I had to change some settings:

    ini_set("default_socket_timeout", 6000); // seconds

You can use Offset and Limit to pull data in batches too.

bwolven ( 2018-11-20 13:33:09 -0600 )
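The batching idea can go two ways: chunk the known IDs into cURL-safe groups, or page through one query with Offset/Limit. A sketch of the first (the batch size of 200 comes from this thread; the Search call is commented out since it needs a live session):

```php
<?php
// Split a big ID list into batches small enough to avoid cURL errors.
$ids = range(100000, 100499);      // 500 hypothetical listing IDs
$batches = array_chunk($ids, 200); // 3 batches: 200, 200, 100

foreach ($batches as $batch) {
    $query = '(L_ListingID=' . implode(',', $batch) . ')';
    // $results = $rets->Search('Property', 'RU_2', $query, ['Limit' => count($batch)]);
}

echo count($batches), "\n"; // 3

// Offset/Limit alternative: page through one big query instead of listing IDs.
// for ($offset = 1; ; $offset += 200) {
//     $results = $rets->Search('Property', 'RU_2', $query,
//         ['Limit' => 200, 'Offset' => $offset]);
//     if ($results->count() < 200) break;
// }
```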

Interesting. I am using XAMPP for testing; I will check the timeout and try again. Previously I was downloading all records in batches of 1500 with no errors. When downloading specific IDs the query string is very long, which I believe is part of the cURL error. I will try again and make note of the error.

rtoutant ( 2018-11-20 13:54:08 -0600 )

I get cURL error 27 (CURLE_OUT_OF_MEMORY) when I try 500 IDs.

rtoutant ( 2018-11-20 13:57:41 -0600 )

I set the memory limit quite high, but you may need to use lower values.
I would suggest using the Offset and Limit functionality instead of big lists of IDs.
The other option is to use POST instead of GET:

    $config->setOption('use_post_method', true); // boolean

Also, I'm using PHRETS 2.0.

bwolven ( 2018-11-20 14:12:13 -0600 )
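The settings discussed in this thread can be applied together before logging in. A sketch (the specific values are examples to tune for your environment; the use_post_method option is PHRETS 2.0 as noted above):

```php
<?php
// Raise PHP limits before running large PHRETS pulls.
ini_set('default_socket_timeout', '6000'); // seconds before a socket read times out
ini_set('memory_limit', '512M');           // cURL buffers large responses in memory

echo ini_get('default_socket_timeout'), "\n"; // 6000
echo ini_get('memory_limit'), "\n";           // 512M

// And prefer POST so long DMQL queries don't hit GET URL-length limits:
// $config->setOption('use_post_method', true); // PHRETS 2.0
```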

Were you able to get this working better?

bwolven ( 2018-11-29 09:24:20 -0600 )

1 Answer


answered 2018-11-29 13:30:10 -0600

OMG, yes! Thank you for your help.

I am just finishing up the new code, and even though I download 100 IDs at a time, it now downloads only the updates, which speeds everything up and reduces total downloads. :)

Thanks again for your assistance in this matter.



Seen: 581 times

Last updated: Nov 29 '18