-
• #2252
You just need to take time and debug the shit out of this. Review the errors - I usually set the type as TEXT in those columns, filter out any shit data, then change the type afterwards. Welcome to the world of shit big data that shouldn't be anywhere near a spreadsheet.
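The "import as text, clean, then retype" idea can be sketched in plain Python (an illustrative stand-in for the Power Query steps; the sample values are invented):

```python
# Treat a messy numeric column as text, drop rows that won't parse,
# then convert the survivors to the proper type.
raw = ["100", "200", "N/A", "300.5", "", "oops"]

def is_number(s):
    try:
        float(s)
        return True
    except ValueError:
        return False

clean = [float(s) for s in raw if is_number(s)]
print(clean)  # only the rows that survive the filter
```

The point is the ordering: filtering happens while everything is still text, so nothing errors out before you get a chance to inspect it.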
-
• #2253
.
-
• #2254
What are you trying to do?
Why are you joining these two CSVs? It's sometimes better to do all your stuff to the tables (removing columns etc.) and then join.
You can always group a table by columns and sum a helper column to get a count of instances; join this back to the step before the Group By so you get a count of the number of times each row appears.
-
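The group-count-and-join-back pattern, sketched in Python for illustration (the rows are made up; in Power Query this would be a Group By step merged back onto the earlier step):

```python
from collections import Counter

# Hypothetical rows: (key, value) pairs. We want each row annotated with
# how many times its key appears - group, count, then join back.
rows = [("a", 1), ("b", 2), ("a", 3), ("a", 4), ("b", 5)]

counts = Counter(key for key, _ in rows)                 # the Group By + count step
joined = [(key, val, counts[key]) for key, val in rows]  # join counts back on key
print(joined)
```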
• #2255
What are you trying to do?
The same exercises as before - 1. identify mismatching pairs, 2. work out the correct value for 'field X' in each mismatched pair, 3. fill an upload sheet with differently named headers so it can be sent for remediation.
Why are you joining these two CSVs?
The "Report" is made up of these two CSVs - reportdateA = the first 70k rows, reportdatea = the next 69kish rows.
So while I could do the UID and column deletes in both separately, the lookup for 'field X' has to be done in a combined sheet.
I also need to do some other filters, like ignoring rows where field X for a particular role has a date < today, and flagging rows where certain columns = 'yes' under some other conditions.
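The append-then-filter flow described above, sketched in Python (the field names and sample rows are entirely invented; in Power Query this would be an Append Queries step followed by filter steps):

```python
from datetime import date

# Two report halves appended into one table (stand-ins for the two CSVs),
# then the filters: drop rows whose date is already in the past, and
# flag rows where an example 'approved' column is "yes".
chunk_a = [{"uid": 1, "field_x": date(1990, 1, 1), "approved": "yes"}]
chunk_b = [{"uid": 2, "field_x": date(2990, 1, 1), "approved": "no"},
           {"uid": 3, "field_x": date(2990, 6, 1), "approved": "yes"}]

combined = chunk_a + chunk_b                                     # the append step
current = [r for r in combined if r["field_x"] >= date.today()]  # ignore date < today
flagged = [r["uid"] for r in current if r["approved"] == "yes"]
print(flagged)
```

Doing the column deletes before the append keeps the combined table small, which matters at 140k rows.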
-
• #2256
You've got errors in your source data. You need to sort that out before progressing because you'll just keep carrying the errors through.
Click on the 1,790 errors, work out what is causing the errors, change types or fix some other way on the source, refresh and see if it imports with no errors. If you are still getting errors then repeat. Getting rid of them all can take a few goes. Then do the same for the other source. Get it clean before doing anything else.
-
• #2257
Yeah, if those 1,790 errors are important then they're worth fixing. If they're all in a field you don't need, kill the field and move on.
They're almost certainly something like this: you have a date field, it should be yyyymmdd, and what you have is "Tuesday - 10pm" or some other free text. If you identify the fields you need, this will help you later as well.
In one of the first steps of importing the data, Power Query helpfully guesses what the contents of each field are (based on the first 1,000 rows or the whole data set). It'll usually be the step called "Changed Type" or something similar. You can usually get rid of this, leave the types as Any, and then set them manually after reviewing the dataset.
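Finding out what's actually behind an error count like that comes down to testing each value against the intended type, roughly like this Python sketch (sample data invented):

```python
from datetime import datetime

# Find which values in a supposed-date column can't actually be parsed -
# roughly what sits behind Power Query's error count on a typed column.
column = ["2024-01-05", "Tuesday - 10pm", "2024-02-11", "TBC"]

def parses(s):
    try:
        datetime.strptime(s, "%Y-%m-%d")
        return True
    except ValueError:
        return False

errors = [v for v in column if not parses(v)]
print(errors)  # the free-text junk causing the type errors
```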
-
• #2258
As above... then you get to my morning's work...
1 Attachment
-
• #2259
What. The. Actual. Fuck.
I've gone back to Excel and all my formulas return "0".
Genuinely about to lose it now.
Definitely a good lesson in not trying anything new unless you have bags of free time.
Edit Restarted everything and it seems to be back to normal
-
• #2260
Copilot is always telling me to use index and match to look up.
I want to make sure I'm looking up against every single row and also the lookup not stopping when it hits blank rows.
Is it correct, because it seems to absolutely love it as a suggestion.
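For what it's worth, exact-match INDEX/MATCH does scan the whole range and does not stop at blanks - it returns the result at the first matching position. The behaviour can be sketched in Python (sample data invented):

```python
# Sketch of what exact-match INDEX/MATCH does: walk every row of the
# lookup column (blanks included) and return the value from the result
# column at the FIRST matching position.
lookup_col = ["apple", "", "banana", "", "banana"]
result_col = [10, 0, 20, 0, 30]

def index_match(key):
    for i, v in enumerate(lookup_col):  # scans every row, not stopping at blanks
        if v == key:
            return result_col[i]        # first match wins
    return None                         # the #N/A equivalent
```

So if a key appears twice, you only ever see the first occurrence's result.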
-
• #2261
I think it’s based on how many times it’s been used in the material to train copilot.
Xlookup does the same now.
-
• #2262
I love xlookup, but I thought it stopped when it hit the first instance, rather than carrying on down the column?
-
• #2263
isn't that where countif or sumif with xlookup becomes useful?
I'm a bit rusty after 6 months out of the game so may be wrong.
-
• #2264
Maybe.
I've been using Copilot a lot because it's inside the client organisation I'm working in, so as I'm going through it's so much easier to just talk it through as I go. For me anyway.
Previously I'd check stuff using my company machine on chatgpt but change all the names of stuff. Luckily I can paste into my client's VM so that reduced a bit of effort, but copilot has been so much easier. But I just don't really trust it.
-
• #2265
That's probably because people used to have a weird hard-on for that method vs VLOOKUP, as it was slightly more efficient in certain edge cases.
Just use xlookup.
-
• #2266
Is it possible to produce a report of dependents / precedents rather than try to unpick fuckloads of coloured arrows?
-
• #2267
There's various paid add-ins like this https://jkp-ads.com/reftreeanalyser.aspx and I think KuTools has one too but no easy free way so far as I'm aware.
I did look at doing something in VBA when I wanted it but quickly gave up.
-
• #2268
understood, thanks.
I need to deconstruct & reconstruct a Howl's Moving Castle style workbook so will just have to take great care in doing so.
-
• #2269
One trick is to enter
=1:1048576
in a spare cell which makes the cell refer to the whole worksheet. You'll get an error message when you do it but ignore it. Then trace precedents a few times on that cell and it will give you arrows for all the active cells in the worksheet.
-
• #2270
thanks I'll give that a go next week
-
• #2271
- I'm using xls as an action tracker with multiple sheets
- certain columns have conditional formatting
- I'll make pivots off the information
- I'll copy and paste some of the tracker tables into ppt on a regular basis
- this will ultimately be owned by other people.
Question
Will formatting as a table make life difficult down the line?
Pros: consistent formatting as rows are added
Cons: ?
Cheers
-
• #2272
this will ultimately be owned by other people.
No it won't.
-
• #2273
I don't think, happy to be very wrong here, you can apply conditional formatting and have that come back in a pivot.
You could do a "if this is that then 1, 0" in your table and have that value come back in a pivot.
-
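The 1/0 helper-column idea above, sketched in Python (invented data): flag each row, then a pivot-style group-and-sum gives a count per category.

```python
from collections import defaultdict

# The "if this is that then 1, 0" helper column: flag each row, then a
# pivot-style Sum of the flag per group gives the count of matches.
rows = [("north", "open"), ("north", "closed"), ("south", "open")]

flagged = [(region, 1 if status == "open" else 0) for region, status in rows]

pivot = defaultdict(int)
for region, flag in flagged:
    pivot[region] += flag  # Sum of the helper column per region

print(dict(pivot))
```

Because the flag is just a number, the pivot can sum it directly, which is the workaround for conditional formatting not carrying through.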
• #2274
Cons:
People get really confused about tables, break them and then complain things aren't working.
If you pass it on it will get broken/"improved" regardless, unless you lock stuff. But having the pivot range auto-update is probably worth those problems. There are other clever ways to do that using the OFFSET formula in Name Manager, but it will be even less understandable to others if it breaks.
-
• #2275
unless you lock stuff.
Pretty sure if Protect Sheet is used then new rows added don't expand the table range, iirc.
I just need to do a YouTube course / tutorial as this is just not working for me.
I have:
But the whole thing won't load (black=what I want, red=source, purple=removed column I thought was the problem)
It feels like there is a fundamental concept I am missing about how PQ works. I literally just want the sheet for the work I've done thus far so I can do my lookups, as I'm getting nowhere. But now I can't even do that.
I was hoping I'd be able to do 90% of my steps on this stupid massive sheet to get it down to a manageable size, while also having each step neatly labelled, and being able to copy the code from the Advanced Editor so I can perform all these steps again with new data (the 2 source CSVs have a consistent name containing each day's date).
Now thoroughly depressed. But have also locked onto it and am struggling to let it go.
1 Attachment
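The reuse idea at the end - pointing the same saved steps at each day's files - usually comes down to parameterising the source names. A Python sketch of the naming logic (the file-name pattern here is invented, not the poster's actual names):

```python
from datetime import date

# Build the day's source file names from a fixed pattern so the same
# saved query steps can be pointed at each new day's data.
def source_names(day):
    stamp = day.strftime("%Y%m%d")
    return [f"report_part1_{stamp}.csv", f"report_part2_{stamp}.csv"]

print(source_names(date(2024, 3, 1)))
```

In Power Query the equivalent is a date parameter spliced into the file path at the Source step.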