
search query by ListingID

I am trying to create a query using PHRETS to download records by L_ListingID.

(PHRETS)

    $results = $rets->Search('Property', 'RU_2', '*', ['Limit' => 1, 'Select' => "224934467"]);

I am getting the return error "Invalid Select". This is a test query; I am trying to write a query that selects up to 100 L_ListingIDs. Can you point me in the right direction?

asked by rtoutant (2018-11-19 00:52:59 -0500)

Comments

Here is the solution:

$results = $rets->Search('Property', 'RU_2', 'L_ListingID=224934467', ['Limit' => 1]);
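
For reference, a rough sketch of reading those results with PHRETS 2 (the field names are just the ones used in this thread; the parentheses follow standard DMQL):

    // $rets is an already logged-in \PHRETS\Session
    $results = $rets->Search('Property', 'RU_2', '(L_ListingID=224934467)', ['Limit' => 1]);
    foreach ($results as $record) {
        echo $record->get('L_ListingID'), ' => ', $record->get('L_Status'), PHP_EOL;
    }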

Does the property class need to be in the query?

rtoutant (2018-11-19 01:09:41 -0500)

What RETS version are you using?
RETS/1.8 and up allow you to send an empty query.
Normally you would want to do incremental searches by date: (L_UpdateDate=2018-11-18+)
To pull a list where you know the IDs, use a query like (ListingID=123,124,128) to pull 3 listings by ID.
Select, if you include it, is the comma-separated list of field names you want returned.
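
In PHRETS terms those patterns look roughly like this (resource/class names are taken from the question above, and the extra IDs are placeholders):

    // Incremental search: "+" after a date means "on or after" in DMQL
    $updated = $rets->Search('Property', 'RU_2', '(L_UpdateDate=2018-11-18+)');

    // Pull specific listings by ID and limit the returned columns with Select
    $byId = $rets->Search('Property', 'RU_2', '(L_ListingID=224934467,224934468,224934469)', [
        'Select' => 'L_ListingID,L_Status,L_UpdateDate',
    ]);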

bwolven (2018-11-19 08:38:46 -0500)

Were you able to get this working?

bwolven (2018-11-20 13:01:37 -0500)

I think so. What I am trying to do is this: download just the basics

query="(L_InputDate=2016-01-01T00:00:00+),(L_Status=|1_0,2_0,2_1,2_2,3_0,4_0,5_0,5_1,6_0)";

select="L_Status,L_ListingID,L_InputDate,L_Update

Compare the input dates to determine which listings require updates.

Download up to 200 or so IDs at a time to update (any more than 250 and I get a curl error).

The question I have now involves pictures: is there a query to just download a particular index?
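
A rough sketch of that update pass, plus one way to pull a single photo by index with GetObject (the 'Photo' object type and the index value are assumptions; check your server's metadata):

    // Pull just the basics to compare against the local database
    $query  = '(L_InputDate=2016-01-01T00:00:00+),(L_Status=|1_0,2_0,2_1,2_2,3_0,4_0,5_0,5_1,6_0)';
    $select = 'L_Status,L_ListingID,L_InputDate,L_UpdateDate';
    $basics = $rets->Search('Property', 'RU_2', $query, ['Select' => $select]);

    // Fetch only object index 1 for one listing instead of '*' (all photos)
    $photos = $rets->GetObject('Property', 'Photo', '224934467', '1');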

rtoutant (2018-11-20 13:24:01 -0500)

I can pull much more than 200 with PHRETS, but I had to change some settings:
ini_set('memory_limit','16384M');
ini_set("default_socket_timeout", 6000); // seconds
You can use Offset and Limit to pull data in batches too.
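
A minimal sketch of that Offset/Limit batching (batch size and query are placeholders):

    $limit  = 200;
    $offset = 0;
    do {
        $batch = $rets->Search('Property', 'RU_2', '(L_UpdateDate=2018-11-18+)', [
            'Limit'  => $limit,
            'Offset' => $offset,
        ]);
        $returned = 0;
        foreach ($batch as $record) {
            $returned++;
            // ... compare/update this record locally ...
        }
        $offset += $limit;
    } while ($returned === $limit);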

bwolven (2018-11-20 13:33:09 -0500)

Interesting. I am using XAMPP for testing, so I will check the timeout and try again. Previously I was downloading all records in batches of 1500 with no errors. By downloading specific IDs, the query length gets very large; I believe that is part of the curl error. I will try again and make note of the error.

rtoutant (2018-11-20 13:54:08 -0500)

I get curl error 27 (CURLE_OUT_OF_MEMORY) when I try 500 IDs.

rtoutant (2018-11-20 13:57:41 -0500)

I set the memory really high, but you may need to use lower values.
But I would suggest using Offset and Limit functionality instead of big lists of IDs.
The other option is to use POST instead of GET.
$config->setOption('use_post_method', true); // boolean
Also I'm using PHRETS 2.0.
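
For context, roughly how that option fits into a PHRETS 2.0 session setup (login URL and credentials are placeholders):

    $config = new \PHRETS\Configuration;
    $config->setLoginUrl('https://rets.example.com/login'); // placeholder
    $config->setUsername('user');                           // placeholder
    $config->setPassword('pass');                           // placeholder
    $config->setRetsVersion('1.8');
    $config->setOption('use_post_method', true); // send searches via POST instead of GET

    $rets = new \PHRETS\Session($config);
    $rets->Login();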

bwolven (2018-11-20 14:12:13 -0500)

Were you able to get this working better?

bwolven (2018-11-29 09:24:20 -0500)

1 Answer


OMG, yes. Thank you for your help.

I am just finishing up the new code, and even though I download 100 IDs at a time, it only downloads the updates, which speeds everything up and reduces total downloads. :)

Thanks again for your assistance in this matter.

answered by rtoutant (2018-11-29 13:30:10 -0500)

