Proxmark3 community

Research, development and trades concerning the powerful Proxmark3 device.

#1 2016-07-22 13:08:15

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Some Source enhancements (basically master token creation)

As I mentioned in the introduction section, I recently spent a short time fiddling around with the Legic sources (thank you for your welcome messages: http://www.proxmark.org/forum/viewtopic.php?id=3419). Since I know the Legic system quite well, I started to fix a few things.
I tried to generate master tokens from blank media, but this failed because the DCF field could not be written properly. So I merged a lot of code posted around this forum, saw that it still didn't work, and started to modify it on my own.
I realized that some system facts do not seem to be disclosed yet, so I first reworked the "legic decode" command to work properly with all possible kinds of cards. It now shows the decoded card data correctly (and no longer simply crashes in most cases). Please pay attention to the DCF field there: so far it seems to be known only as a decremental counter field.

As the second part, the DCF writing did not work. The reason is that the DCF has to be written in a consecutive session of write commands in reversed order: byte 6 first, then byte 5. But writing more than one byte in a run failed (with my Proxmark3 from Elechouse). I tuned the timings and DCF writing works now. I also modified the code from @Icsom (he was the first to try to fix this issue) to work even better, so you can write data in any way; if the data area includes the DCF fields, the code will manage the proper writing of this field on its own. In practice, more than 2 bytes per run will not work most of the time: 3 bytes fail most of the time, and 4 bytes only work with a lot of luck. I think this is a hardware issue rather than a software issue. I know from developing with official Legic chipsets that Prime handling is a little bit difficult; because of this, most Legic readers raise the RF power during write actions.

A second issue is the PM3 RF hardware itself: there is no proper line termination at all. This results in reflections, which are especially hard to handle during write actions. The antennas should also have a resistor in parallel with the winding to lower the Q factor. A high Q factor gives more RF power, of course, but the whole system becomes very fragile. Sadly this is not possible with the PM3 hardware, since the RF power then gets too low to work properly. A nasty situation. Older cards, like the old MIM22, draw so much RF power that they are not even detected in the field. I didn't dig deeper into this, but I think Legic Prime will never work very well on this hardware base. I simply stopped at the point where DCF writing worked for me (so I may be wrong with my conclusion).
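To make that ordering rule concrete, here is a minimal sketch of the logic. The legic_write_byte helper is hypothetical and stubbed out so the example runs (the real PM3 write path looks different), and the DCF byte addresses 5 and 6 follow from the dump shown later in this thread:

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical single-byte write helper; stubbed out here so the sketch runs. */
static bool legic_write_byte(uint16_t addr, uint8_t value)
{
    printf("write 0x%02X to address %u\n", (unsigned)value, (unsigned)addr);
    return true;
}

/* Sketch of the ordering rule described above: if the requested range covers
   the DCF (byte addresses 5 and 6 of the system area), write byte 6 first,
   then byte 5, then the rest of the range. */
static bool legic_write_range(uint16_t start, uint16_t len, const uint8_t *data)
{
    bool covers_dcf = (start <= 5) && (start + len > 6);

    if (covers_dcf) {
        if (!legic_write_byte(6, data[6 - start])) return false; /* byte 6 first */
        if (!legic_write_byte(5, data[5 - start])) return false; /* then byte 5 */
    }
    for (uint16_t i = 0; i < len; i++) {
        uint16_t addr = start + i;
        if (addr == 5 || addr == 6)
            continue; /* DCF bytes already written above */
        if (!legic_write_byte(addr, data[i]))
            return false;
    }
    return true;
}

int main(void)
{
    /* example: write bytes 4..7, which includes the DCF at addresses 5 and 6 */
    const uint8_t data[] = { 0xED, 0x60, 0xEA, 0x9F };
    return legic_write_range(4, sizeof(data), data) ? 0 : 1;
}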

Okay, as noted above, I wanted to write master tokens... spot the problem? Yes: how do you create such master token data to later write it to a chip? At this point I have to note that @Icsom wrote a great Lua script for Legic, which Iceman pointed me to yesterday. I had never looked into it, so I created my own little tool to generate master tokens from blank media dumps. Anyway, this was not just wasted time in the end, since I saw that the master token implementation done by Icsom is not complete. My tool includes "all the magic" needed to create any kind of available master token, except special things like limited IAMs (I can give some information on request). So now you will be able to create GAM, IAM, XAM and SAM cards just like you could (except for the noted "specials") with the official Legic software. I even added the reason why some people here failed to generate GAMs with OL=1.

So I think the main value of my commit is a better understanding of the Legic stuff rather than huge functional improvements. I just wanted to see what is currently possible with the PM3 and realized that, since master tokens can now be generated in any way, the Prime system is fully hacked. There is no security left at all. Okay, simulation is not working yet, so the UID is still somewhat "safe", but that is just a PM3 limitation.

Finally, the stuff itself (I have to note this is my first open source contribution in decades):

Here's a precompiled Windows binary package (I'm some kind of Linux hater): https://www.sendspace.com/file/sciws5
This is basically Iceman's GitHub source (downloaded 4 days ago), merged with Icsom's GitHub source and modified along the way so they work together. On top of that I added the stuff described above.

Here’s the source package the binaries are made from: https://www.sendspace.com/file/za72uw

Here's the source of my "LegicCrcCalc" tool (included as a binary in the binary package): https://www.sendspace.com/file/rcjjch
The name comes from the fact that I started with a tool that just recalculated the MT header CRC (I had always edited the values manually). I then extended it with general master token creation functions for this community.

Since this source failed to compile with the old "ProxSpace" MinGW package, I created an updated one for my needs. If someone wants it, just ask (it's about 360 MB, and I don't want to upload it without a reason).

Finally, I want to thank Iceman for his work on finding the Legic CRC parameters, which makes it possible to create Legic cards this way. Maybe we can now start finding the parameters for the CRC16 mode of data segments?

Offline

#2 2016-07-26 19:48:39

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

Great work Jason!
I have to merge your distributed code into my fork ASAP, but it would have helped if you had used GitHub for this.

Regarding CRC16 on data segments: can you give me a complete data segment with the CRC in it, plus the UID and UID CRC? I need it to verify my changes.

And I can't take credit for something other people did before me:

@Szakalit here on the forum solved it in 2012 (http://www.proxmark.org/forum/viewtopic.php?id=1228), where he also mentions that he figured out the CRC16.

What I did was just implement the CRC8 in the PM3 source, which is way less fancy.

Offline

#3 2016-07-27 15:10:12

mosci
Contributor
Registered: 2016-01-09
Posts: 94
Website

Re: Some Source enhancements (basically master token creation)

Unfortunately, I won't download packages that contain binaries from insecure sources (yes, I'm sometimes kind of paranoid), so I'll wait until Iceman has merged it into his branch and I can read and compile it on my Linux box.
But I'm pretty curious about it!

Jason wrote:

Maybe we can now start finding the parameters for the CRC16 mode of data segments?

CRC16 on Legic Prime data segments?
I have never seen those; on all the tags I have seen, only CRC8 was used.
CRC16 was only used for transport (a consistency check) between the host and the security module, and that is an ordinary CRC-CCITT (CRC16). You can set it to 0x0000, though, and it will then be ignored by the security module.

from the documentation:

If a host system is not able to calculate a 16 bit CRC (e.g. low performance CPU), the CRC value in the command 
can be set to 0x0000. In this case, the CRC is ignored by the SM-4200 and no consistency check is performed.
As an alternative, the CRC value can be calculated in advance and added to the command as a static number.
The answer of the SM-4200 always contains a CRC value.
unsigned int CRC16(unsigned char *p, int n)
{
    /* CRC-CCITT, reversed polynomial 0x8408, initial value 0xFFFF */
    unsigned int polynom = 0x8408;
    unsigned int crc = 0xFFFF;
    int i, k;

    for (i = 0; i < n; i++) {
        crc ^= p[i];
        for (k = 0; k < 8; k++) {
            if (crc & 1)
                crc = (crc >> 1) ^ polynom;
            else
                crc = crc >> 1;
        }
    }
    return crc;
}
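
For reference, this is how one might exercise the routine (compile it together with the CRC16 function above; the sample bytes are arbitrary, borrowed from the crc command samples further down, and not a real SM-4200 command):

#include <stdio.h>

int main(void)
{
    /* arbitrary example payload, only to show how CRC16() above is called */
    unsigned char cmd[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0x11, 0x22 };

    unsigned int crc = CRC16(cmd, (int)sizeof(cmd));
    printf("CRC16 = 0x%04X\n", crc);
    return 0;
}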

Last edited by mosci (2016-07-27 15:18:53)

Offline

#4 2016-07-28 14:45:41

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

mosci wrote:

Unfortunately, I won't download packages that contain binaries from insecure sources (yes, I'm sometimes kind of paranoid)

Oh, that is not paranoid. I always run unknown code in VMs whose changes are discarded when I close them. Safety first...

Anyway: you can compile the source on Linux, that works. I did it in a VM the first time.


mosci wrote:

CRC16 on Legic Prime Data Segments?

Yes! What you are talking about is the successor of the simple LRC used in the low-level communication with the official Legic reader modules/chips. The older advant 2000 series offers both choices, LRC and CRC; the 4000 series only CRC.
And yes, the 0x0000 trick doesn't work any more. It only worked in the first few (beta) releases. At some point they realized that an MCU which struggles with CRC calculation will mess up more things than just the communication...

Anyway: you can use two kinds of CRCs in data segments, 8-bit and 16-bit. The reason is that a larger data chunk is not really well protected by an 8-bit CRC; a couple of bit flips can produce the same checksum, and that risk grows with larger chunks. The CRC16 is nothing new, it has existed for a long time... I think even SM-05 modules can handle it (I should verify that next time). It has nothing to do with advant.


iceman wrote:

but it would have helped if you had used GitHub for this

You are right, and I'm sorry! I'm some kind of old-school SVN user and have never got in touch with Git. I used WinMerge to merge your sources with Icsom's, so I could modify them on the fly directly in that editor; there was no need for any versioning on my side. I should probably give Git a try some day.

iceman wrote:

Regarding CRC16 on data segments: can you give me a complete data segment with the CRC in it, plus the UID and UID CRC?

Of course, freshly generated with a demo GAM:

RAW data:

40 f2 48 fc ed 60 ea 9f
ff 00 00 00 11 01 21 80
08 40 2a c1 00 00 cc 2d
e4 ad 92 ed ec ef ee ff
d9 bb 40 87 c8 11 3a b7
a9 8b 35 e1 46 35 ed ed
ed ed ed ed ed ed ed 00
00 00 00 00 00 00 00 00

(I stripped the unused area)

And the decoded data:

CDF: System Area
------------------------------------------------------
MCD: 40, MSN: f2 48 fc, MCC: ed OK
DCF: 60000 (60 ea), Token Type=IM-S (OLE=0)
WRP=15, WRC=1, RD=1, SSC=ff
Remaining Header Area
00 00 00 11 01 21 80 08 40 2A C1 00 00 

ADF: User Area
------------------------------------------------------
Segment 01 
raw header | 0x21 0xC0 0x09 0x40 
Segment len: 33,  Flag: 0xC (valid:1, last:1), WRP: 09, WRC: 04, RD: 0, CRC: 0x7F (OK)
WRC protected area:   (I 27 | K 0 | WRC 4)

row  | data
-----+------------------------------------------------
[00] | 00 01 02 03
Remaining write protected area:  (I 31 | K 31 | WRC 4 | WRP 9  WRP_LEN 5)

row  | data
-----+------------------------------------------------
[00] | 12 34 56 AD 6A
Remaining segment payload:  (I 36 | K 36 | Remain LEN 19)

row  | data
-----+------------------------------------------------
[00] | 25 FC D7 5A 44 66 D8 0C AB D8 00 00 00 00 00 00
[01] | 00 00 00
-----+------------------------------------------------

(By the way, I noticed the "decode" function does not put the hex bytes into the log... something for me to fix.)


I added a well-known KGH, except that I used a CRC16 instead of a CRC8. This CRC16 is the last 2 bytes after the BCD-encoded card number (AD-6A); I excluded the WRC/WRP/RD flags from the calculation (so it covers just the UID, the stamp and the card number 123456). Then I added some random data and appended another CRC16: AB-D8.

Offline

#5 2016-07-28 19:24:08

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

@jason   Did you only change the "cmdhflegic.c" file?


[edit] Nope, you changed some more. I think I have got your changes into my fork now; haven't tested them yet.

Offline

#6 2016-07-29 12:52:52

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

Yes, I did.

By the way: I will post some data from Legic advant media in the other thread in the next few minutes.

Offline

#7 2016-07-29 13:29:20

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

Would you mind checking my fork to verify that it got your changes in correctly?

Offline

#8 2016-07-29 13:36:21

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

OK, I will check it this evening or tomorrow, no problem.

Offline

#9 2016-07-30 19:46:10

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

I did a quick review... I have no PM3 at hand right now to run the code (it is sitting on another desk at the moment).
I found what may be a mistake:

Line 71 of cmdhflegic.c:

	GetEMLFromBigBuf(data_buf, sizeof(data_buf), 0);

I changed this to GetFromBigBuf, since I got no data (read beforehand) into the buffer; it was just filled with zeros that way. On Monday I will check the firmware running on a PM3.

By the way, I saw you added your reworked code for signal interpretation. I will take a closer look at it next week... I haven't looked closely yet, but could it be that the processor is slightly too slow for this kind of code?

Last edited by Jason (2016-07-30 19:46:36)

Offline

#10 2016-07-30 19:55:36

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

Oh no, the part with the signal interpretation is definitely unfinished. Nothing makes sense to me about how the correlated I/Q signal comes from the FPGA, like how long it is and what the values mean...

I tried to get rid of the old BigBuf handling in the Legic code; it has not been updated to how the big buffer can be used today (or rather since a year back, when Piwi rewrote it).

Offline

#11 2016-08-02 17:31:21

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

Sorry Iceman for my late reply... I have little time at the moment.

I compiled your fork and flashed it to my PM3, but something is screwed up. I always get CRC errors when reading a card:

#db# setting up legic card
#db# MIM256 card found, reading card ...
#db# !!! crc mismatch: expected c but got e !!!
#db# operation aborted

I reverted your changes to the CRC routines (the replaced LegicCRC8), but that gives no better results. I have not dug any deeper yet, since I don't have enough time today. For now I can say: the current implementation breaks the Legic stuff.
I will look deeper into this issue this week (maybe already tomorrow).


Edit:
Oh... I reviewed your CRC code changes too hastily: after also reverting the changes to crc_update, the Legic reading works. But as I stated in the post above, "hf legic decode" gives only zero values:

pm3 --> hf legic decode

CDF: System Area
------------------------------------------------------
MCD: 00, MSN: 00 00 00, MCC: 00 Fail
DCF: 0 (00 00), Token Type=IM (OLE=0)

ADF: User Area
------------------------------------------------------
Unsegmented card - WRP: 00, WRC: 00, RD: 0
Remaining segment payload:  (I 8 | Remain LEN 1002)

row  | data
-----+------------------------------------------------
[00] | 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[01] | 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[02] | 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[03] | 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
(...)

The GetEMLFromBigBuf needs to be GetFromBigBuf (I have not dug any deeper into the difference yet... I'm new to the PM3 and still have a lot to learn about the available routines).

Your CRC changes broke the CRC calculation; maybe the changed polynomial is wrong, or the reflection handling breaks the correct result.

Last edited by Jason (2016-08-02 18:06:30)

Offline

#12 2016-08-02 19:20:09

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

There are more bugs, I'll take care of them one by one.

The crc8legic should still be the same after my changes. I've tested it against the UID and what the MCC should be, so that is a strange one.

Easy to test: 'hf legic crc b uidbytes' should give you the correct MCC.

pm3 --> hf legic crc h
Calculates the legic crc8/crc16 on the input hexbytes.
There must be an even number of hexsymbols as input.
Usage:  hf legic crc8 [h] b <hexbytes> u <uidcrc> c <crc type>
Options:
      b <hexbytes>  : hex bytes
      u <uidcrc>    : MCC hexbyte
      c <crc type>  : 8|16 bit crc size

Samples:
      hf legic crc8 b deadbeef1122
      hf legic crc8 b deadbeef1122 u 9A c 16
pm3 -->

edit:
I think the CRC is OK; it's the data fetching from the device that returns zeros, which makes all the CRC calculations wrong.

Offline

#13 2016-08-03 08:29:03

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

@jason,   https://github.com/iceman1001/proxmark3/commit/a28d34f407dc172f1e5b0dc257e8fc89e67706f0

The 0x07 >> 4 will always clear it out.  Are you sure this shift is correct?

[edit]
Found the other place; the correct mask is 0x70 >> 4. Fix pushed to the repo.
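
To illustrate why the original mask cannot work (the byte value below is just an example):

#include <stdio.h>

int main(void)
{
    unsigned char raw = 0x5A; /* example byte; the flag sits in the high nibble */

    /* buggy: 0x07 keeps only the low 3 bits, so shifting right by 4 always gives 0 */
    unsigned char wrong = (raw & 0x07) >> 4;

    /* fixed: 0x70 keeps bits 4-6, and shifting right by 4 extracts the flag value */
    unsigned char right = (raw & 0x70) >> 4;

    printf("wrong = %u, right = %u\n", wrong, right); /* prints: wrong = 0, right = 5 */
    return 0;
}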

Offline

#14 2016-08-05 17:10:35

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

Sorry for my late reply (time is short...): yes, you are right, I messed up that line. It is 0x70, not 0x07; otherwise it makes no sense, as you said.

And about the CRC: you are right, the LegicCRC8 routine is working, but the raw data packets are checked with a 4-bit CRC. This routine does that job:

/* calculate crc for a legic command */
static int LegicCRC(int byte_index, int value, int cmd_sz) {
	crc_clear(&legic_crc);
	crc_update(&legic_crc, 1, 1); /* CMD_READ */
	crc_update(&legic_crc, byte_index, cmd_sz-1);
	crc_update(&legic_crc, value, 8);
	return crc_finish(&legic_crc);
}

I tried to fix this, but had no luck. What I noticed while tracing both variants (the old one and your new one) line by line: the new code has trouble with bit-based calculations like the one shown above, and I always get wrong results that way. When it is fed whole bytes (8 bits), the new routine works fine.
Since I don't have more time at the moment, I stopped here for now... I just wanted to let you know about this.
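
For comparison, here is a minimal, self-contained sketch of a bit-serial (LSB-first) CRC update that can absorb arbitrary bit widths, which is exactly what the 4-bit command CRC above relies on. This is not the PM3 implementation, and the width and polynomial below are placeholders rather than the real Legic parameters:

#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint32_t state;
    uint32_t poly;  /* reflected polynomial, fits into 'width' bits */
    int      width; /* CRC width in bits, e.g. 4 */
} bitcrc_t;

static void bitcrc_clear(bitcrc_t *c)
{
    c->state = 0;
}

/* Feed 'nbits' bits of 'data', least significant bit first. Because the
   register advances one input bit at a time, partial bytes (such as a 4-bit
   command field) are absorbed the same way as whole bytes. */
static void bitcrc_update(bitcrc_t *c, uint32_t data, int nbits)
{
    for (int i = 0; i < nbits; i++) {
        uint32_t inbit = (data >> i) & 1;
        uint32_t feedback = (c->state ^ inbit) & 1;
        c->state >>= 1;
        if (feedback)
            c->state ^= c->poly;
        c->state &= (1u << c->width) - 1;
    }
}

int main(void)
{
    /* placeholder parameters: 4-bit CRC, reflected polynomial 0xC */
    bitcrc_t crc = { 0, 0xC, 4 };

    bitcrc_clear(&crc);
    bitcrc_update(&crc, 0x1, 1);  /* a single command bit */
    bitcrc_update(&crc, 0x5, 4);  /* a 4-bit field */
    bitcrc_update(&crc, 0xA7, 8); /* a full data byte */

    printf("crc = 0x%X\n", (unsigned)crc.state);
    return 0;
}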

Last edited by Jason (2016-08-05 17:12:04)

Offline

#15 2016-08-06 21:47:09

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

I also have to fix the BigBuf memory allocations in my changes.

And thanks for the heads-up on the CRC4 calculation. I'll take a look at it.

Offline

#16 2016-08-10 09:22:08

Jason
Contributor
Registered: 2016-07-21
Posts: 55

Re: Some Source enhancements (basically master token creation)

Oh, you are rewriting the BigBuf stuff? I started on that as well... have you already begun that part? It makes no sense to do this job twice...

By the way: I found some interesting facts about the advant stuff we wrote about in the other thread. Would it be of any interest to start a new thread about advant where I write down the "known facts" about it, as a starting point for further attacks?

Offline

#17 2016-08-10 09:33:18

iceman
Administrator
Registered: 2013-04-25
Posts: 9,495
Website

Re: Some Source enhancements (basically master token creation)

I'm all for keeping threads clean and on topic.

So please do start a new thread; that way it will be easier to follow.

Offline
