You can't do the "limit 1" at the outer level, you loose what you are looking for... ALL devices last entry. Use a pre-query for the last ID of each device, then join back select Devices. *, DataLog.
Time_Stamp, DataLog. FuelLevel, DataLog. Voltage from ( select DeviceID, max( ID ) LastActionID from DataLog group by 1 ) LastInstance join DataLog on LastInstance.
LastActionID = DataLog. ID join Devices on LastInstance. DeviceID = Devices.
DeviceID order by Devices. DeviceName Per your last comment, I would actually change to something like Update your device table with a "LastLogID". Then, via a trigger an insert into your DataLog table, update the Device table immediately with that new ID... This way, you never need to pre-query the data log directly.. You'll already HAVE the last ID and run from that directly to the data log joined by that ID.
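A minimal sketch of that trigger-based approach, assuming the table and column names used above (the ALTER TABLE statement and the trigger name are illustrative, not from the original answer):

-- Assumed: Devices gains a LastLogID column pointing at its newest DataLog row
ALTER TABLE Devices ADD COLUMN LastLogID INT NULL;

-- After every insert into DataLog, record the new row's ID on the matching device
CREATE TRIGGER trg_DataLog_after_insert
AFTER INSERT ON DataLog
FOR EACH ROW
    UPDATE Devices
       SET LastLogID = NEW.ID
     WHERE DeviceID = NEW.DeviceID;

-- The "latest entry per device" query then needs no pre-aggregation at all
SELECT Devices.*, DataLog.Time_Stamp, DataLog.FuelLevel, DataLog.Voltage
  FROM Devices
  JOIN DataLog ON DataLog.ID = Devices.LastLogID
 ORDER BY Devices.DeviceName;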
You can't do the "limit 1" at the outer level, you loose what you are looking for... ALL devices last entry. Use a pre-query for the last ID of each device, then join back... select Devices. *, DataLog.
Time_Stamp, DataLog. FuelLevel, DataLog. Voltage from ( select DeviceID, max( ID ) LastActionID from DataLog group by 1 ) LastInstance join DataLog on LastInstance.
LastActionID = DataLog. ID join Devices on LastInstance. DeviceID = Devices.
DeviceID order by Devices. DeviceName Per your last comment, I would actually change to something like... Update your device table with a "LastLogID". Then, via a trigger an insert into your DataLog table, update the Device table immediately with that new ID... This way, you never need to pre-query the data log directly.. You'll already HAVE the last ID and run from that directly to the data log joined by that ID.
This also works, thank you very much @DRapp – Johan Simpson Apr 12 at 14:02
@Johan Simpson, if the activity you are dealing with probes quite frequently (minutes or smaller intervals) and the tables get huge, you might want to add a WHERE clause to the inner DataLog GROUP BY so that the date is greater than, say, 3 days ago, so it doesn't keep requerying the ENTIRE data log file – DRapp Apr 12 at 14:16
Yes, this is a good idea, since the DataLog-equivalent table on my production machines has nearly 1 million rows. – Johan Simpson Apr 12 at 14:29
@Johan Simpson, revised answer... recommended adjustments. – DRapp Apr 12 at 14:35
Yes, I will be doing exactly this, thanks! – Johan Simpson Apr 12 at 15:06
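A hedged sketch of the adjustment DRapp suggests in the comment above; only the where line is new, and the 3-day window plus the Time_Stamp column name are assumptions:

select Devices.*, DataLog.Time_Stamp, DataLog.FuelLevel, DataLog.Voltage
  from ( select DeviceID, max( ID ) LastActionID
           from DataLog
          where Time_Stamp >= NOW() - INTERVAL 3 DAY   -- only scan recent log rows
          group by DeviceID ) LastInstance
  join DataLog on LastInstance.LastActionID = DataLog.ID
  join Devices on LastInstance.DeviceID = Devices.DeviceID
 order by Devices.DeviceName

Note that a device with no log rows inside the window drops out of the result, so pick an interval comfortably larger than the longest expected reporting gap.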
I know it's horrible, not elegant and time consuming, but this query works:

SELECT deviceId, deviceName, deviceDescription,
       (SELECT time_stamp FROM datalog
         WHERE datalog.DeviceId = devices.DeviceId
         ORDER BY time_stamp DESC LIMIT 0,1) time_stamp,
       (SELECT FuelLevel FROM datalog
         WHERE datalog.DeviceId = devices.DeviceId
         ORDER BY time_stamp DESC LIMIT 0,1) FuelLevel,
       (SELECT Voltage FROM datalog
         WHERE datalog.DeviceId = devices.DeviceId
         ORDER BY time_stamp DESC LIMIT 0,1) Voltage
  FROM devices

I tried to have a single subquery retrieve multiple columns, but MySQL complains because it wants only one column.
This works, thank you very much! – Johan Simpson Apr 12 at 14:01
@DRapp's solution might be less resource-intensive if I'm correct, do you agree @Marco? – Johan Simpson Apr 12 at 14:03
@Johan Simpson: Yes, it is – Marco Apr 12 at 14:06
Try this. By the way, if you want only the latest row, you can search for it by the auto-increment field (datalog_table.Id):

SELECT dvc.DeviceId, dvc.DeviceName, dvc.DeviceDescription,
       dtl.Time_stamp, dtl.FuelLevel, dtl.Voltage
  FROM device_table dvc
 INNER JOIN datalog_table dtl ON dtl.DeviceId = dvc.DeviceId
 ORDER BY dtl.Id
 LIMIT 1
This only works on the assumption that the DataLog table is ordered chronologically. – Andy Apr 12 at 12:08
I assume that Id is the primary key (auto-increment) of the Devices table – diEcho Apr 12 at 12:11
This only returns data for one device. – Johan Simpson Apr 12 at 12:32
It doesn't work – Marco Apr 12 at 13:30
SELECT d.DeviceId, d.DeviceName, d.DeviceDescription,
       dl.Time_stamp, dl.FuelLevel, dl.Voltage
  FROM Device d, DataLog dl
 WHERE d.DeviceId = dl.DeviceID
 ORDER BY time_stamp DESC
 LIMIT 1
It won't work as you expect it to. – Andy Apr 12 at 12:09
This returns the data for device 2 only; it should return a row for each device with its latest data joined. – Johan Simpson Apr 12 at 12:34
It doesn't work – Marco Apr 12 at 13:29
What I would like to do is select a specific set of rows from one table (table A) and join them with another table (table B), such that each record from table A appears only once, joined with the most recent record from table B, based on a datetime column. So in the desired results, table B is joined with table A on B.Fk_A = A.id, but only the most recent corresponding record from B is included.
...with the records from table A repeated. Obviously my SQL-fu is just not good enough for this task, so I would be most grateful if one of you kind people could point me in the right direction. I have done quite a bit of Googling and searching around SO and I have not found anything that matches this specific task, although I am sure the question has been asked before - I suspect there is an SQL keyword that I am forgetting/unaware of and if I searched for that I would find the answer instantly.
I think this question deals with the same problem although I am not 100% sure and the accepted answer involves SELECT TOP, which I thought (?) was not valid in MySQL. ...where notes contains the "many records" and leads contains the "one record" they relate to. It should also be noted that potentially I would also want to return the oldest record (in a different query) but I imagine this will be a simple case of inverting an ASC/DESC somewhere, or something similarly easy.
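Since MySQL has no SELECT TOP, the same pre-aggregation pattern shown above can be applied to the A/B naming in this question. A sketch, assuming B has an id, a Fk_A foreign key and a created_at datetime column (those column names are assumptions):

SELECT A.*, B.*
  FROM A
  JOIN ( SELECT Fk_A, MAX(created_at) AS latest   -- newest B row per A record
           FROM B
          GROUP BY Fk_A ) last_b ON last_b.Fk_A = A.id
  JOIN B ON B.Fk_A = last_b.Fk_A
        AND B.created_at = last_b.latest

Swapping MAX for MIN gives the oldest record instead, which is the ASC/DESC inversion anticipated above; note that if two B rows share the exact same datetime for one A record, both will be returned.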