-
• #2127
yeah, but i'm going for simplicity here....
-
• #2128
Numbers with more than 15 digits aren't stored exactly in Excel (every number is kept as a double-precision float), so the exact value is lost.
If you don't need to do any maths on them in Excel, you should be fine storing them as text, which will preserve the exact value.
[Edit: it's apparently possible to use DOUBLE ints, but above is probably the simplest solution for the given problem]
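A quick Python sketch of why the digits go missing: Excel keeps every number as an IEEE 754 double, which holds only about 15-16 significant digits, and `float()` behaves the same way here.

```python
# A 19-digit ID (e.g. a card or account number).
n = 1234567890123456789

# Coerce it to a double, which is how Excel stores all numbers.
as_double = float(n)

# The round-trip no longer matches: the trailing digits were rounded away.
print(int(as_double) == n)   # False
```

Stored as text instead, the string keeps every digit, which is why the text workaround is lossless.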
-
• #2129
Does anyone have a sense of how hard(?) this will be for Excel:
- I have a list of 2-200 rows with Ids (list A) that will vary from time to time, and a list of 16k rows (list B) that will be static (or sometimes updated)
- I want to look up IDs in list A to check if they appear in list B.
Will looking up against a sheet with 16k rows be fine or will it constantly fuck up?
Cheers.
-
• #2130
Here's what I do when I deal with csv data with very long numbers.
Save the csv locally, then Data > Get Data > Legacy Wizard (From Text) > delimit by comma, and then choose Text as the data format for the problem columns.
-
• #2131
POWER QUERY
-
• #2132
yeah, i have a feeling the way i'm going is:
save file as csv
get data from csv into a table via power query
-
• #2133
No 😊
It needs to be a suggested tactical solution for the semi-inept. I can ask them to do a look up on sheet A before uploading, but I can't really ask more than that.
I guess I could find out if they are allowed macros.
But we're saying it'll shit the bed.
Ah! Back to the drawing board.
-
• #2134
Seems easy enough to test both from a technical and user POV.
-
• #2135
I don't think it will be that hard.
Intensive, maybe. Determining whether each of 200 numbers is present in a static list of 16,000 numbers in Excel can be achieved using various methods, but the efficiency and difficulty vary depending on the approach you take.
Here are a few options, ranging from simple to more complex:
Using Excel functions (easiest):
- You can use built-in functions like VLOOKUP, MATCH, or COUNTIF to check if each number exists in the list, e.g. =COUNTIF(B:B, A2)>0.
- These functions might become slow when dealing with large datasets.
Conditional formatting:
- You could apply conditional formatting to highlight cells where the number exists in the list. This won't give you a direct list of matches but can visually identify them.
- This method may also slow down with a large list of numbers.
VBA scripting (more advanced):
- Writing a VBA script (macro) can automate the process and might be more efficient for large datasets.
- The script can loop through each of the 200 numbers and check if it exists in the list of 16,000.
- This provides more control and potentially better performance but requires some programming knowledge.
Power Query or Power Pivot (advanced):
- If the data is frequently updated and you need repeatable analysis, Power Query or Power Pivot might be more suitable.
- You can import both lists into Power Query, merge the tables based on the numbers, and filter for matches.
- This is more complex to set up but offers powerful data analysis capabilities.
In terms of difficulty, using Excel functions might be the easiest to set up but could be slow with large datasets. VBA scripting offers more control and potentially better performance but requires programming skills. Power Query or Power Pivot provides advanced analysis capabilities but has a steeper learning curve. Ultimately, the difficulty depends on your familiarity with Excel and programming.
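For a sense of scale, 200 lookups against 16,000 rows is tiny; a hypothetical Python sketch of the COUNTIF-style membership check (made-up IDs) runs instantly.

```python
import random

random.seed(42)

# Hypothetical stand-ins: 16,000 static IDs (list B), then 200 IDs to check
# (list A): 100 known to be in B plus 100 guaranteed to be absent.
list_b = set(random.sample(range(10**9), 16_000))
known = random.sample(sorted(list_b), 100)
unknown = [10**9 + i for i in range(100)]
list_a = known + unknown

# The COUNTIF-style check: is each ID of list A present in list B?
matches = [x in list_b for x in list_a]
print(sum(matches))   # 100
```

Excel's COUNTIF does a linear scan rather than a set lookup, but even 200 x 16,000 comparisons is well within what recalculates in a blink.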
-
• #2136
True. It's just getting 16k rows of data / being able to download a file from somewhere.
Actually I guess I could take one of the massive files I have, put it in pq and duplicate it until I hit 16k.
-
• #2137
Cheers for the detailed response 👍
-
• #2138
I thought about the scan function, but i don't know how it would work here. COUNTIF will.
-
• #2139
Easy and quick. Those are pretty small datasets; it took a fraction of a second to calculate on my 6-year-old laptop.
Just use an xmatch or whatever you want to get a clean result.
-
• #2140
I imported each file as csv via power query.
I know there’s a way to import all folder contents and then split by source name to separate tables, but I was on one screen and had a dog on the keyboard.
-
• #2141
Hoping someone can help me... it's a bookkeeping type of thing.
I have a spreadsheet, dates in column 1, then about 20 subsequent named columns, each column having values (in £s) in it if money has been spent on particular dates.
If money was spent on the same date under different headings then that date is repeated in the date column. I need to create a line graph for each named column showing how the weekly average value (money spent) changes over time.
I can do the line chart bit, but I'm stuffed if I know how the fucking fuck to take all the values in each column and change them into a rolling average value for any given week.
Please help!
-
• #2142
That's a horrendous layout for doing anything with. Much better would be a date in column A, a category in B (use a drop down to make sure they're consistent) and a value in C.
Easiest way to get to this is to unpivot using power query.
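For anyone scripting it instead, a minimal pandas sketch of the same unpivot-then-weekly-average idea, using hypothetical column names and toy figures:

```python
import pandas as pd

# Hypothetical wide layout: a date column plus one column per spending heading.
wide = pd.DataFrame({
    "Date": pd.to_datetime(["2024-01-01", "2024-01-01", "2024-01-08", "2024-01-15"]),
    "Food": [10.0, 5.0, 20.0, 30.0],
    "Travel": [0.0, 15.0, 5.0, 10.0],
})

# Unpivot to the tall date/category/value layout described above.
tall = wide.melt(id_vars="Date", var_name="Category", value_name="Spend")

# Weekly average spend per category, ready to feed a line chart.
weekly = (tall.set_index("Date")
              .groupby("Category")["Spend"]
              .resample("W")
              .mean()
              .reset_index())
print(weekly)
```

Power Query's unpivot does the `melt` step, and a pivot table grouped by week gives you the `resample` step without any code.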
This is a random guide I found on Google
https://exceloffthegrid.com/power-query-unpivot-data/
But searching "unpivot using power query" will give you plenty of guides.
-
• #2143
Hey nice one, I'll look into that and see if I can make things easier that way.
I can do a lot of basic stuff but I really struggle with the layout prep, and more than that, the terminology - it seems that if you know the lingo it's all a lot clearer.
-
• #2144
This helps
https://youtu.be/187xEPVg1Gc
Start laying data out in this way (as described by aggi) and then it becomes easy.
You are a data-reading robot, not a human capable of inference with human preferences. Lay data out accordingly and make your life easier.
-
• #2145
ChatGPT or Copilot or whatever is good for typing in what you want and then looking at the answer for some key terms.
-
• #2146
Is there a formula for checking spelling in Excel? Seems like there should be but searching just brings up the normal spellcheck method.
I could do a custom function with:
Application.CheckSpelling(Word:=cll.Text)
but I'd prefer not to.
-
• #2147
How many rows, words, etc?
I think you have to write a function.
-
• #2148
Yep, macro seemed to be the option.
New challenge: I've got a spreadsheet that people shouldn't have general access to, so it is stored in a folder on OneDrive with limited permissions. However, I do want to create a spreadsheet that links to it that anyone can refresh. I need to double-check, but I think this failed on refresh when I tried it before. Any tips?
-
• #2149
Have you tried flow?
-
• #2150
This is asking for trouble. I would split the public data out into a public spreadsheet.
If it needs to be updated from private data, automate that under a privileged account.
Edit: I have no idea how to do this in OneDrive, TBH
Opening it in Power Query should fix that though, shouldn't it? Format the raw CSV as needed.