Questions tagged [databricks-community-edition]

85 questions
0 votes · 1 answer

How to group by 30-minute intervals in Databricks SQL

This is the function I was using to group by 30-minute intervals in SQL: convert(time(0),dateadd(minute,(datediff(minute,0,a.Datetime)/30)*30,0)), where, for example, a Datetime of 2023-03-09 00:26:01.6830000 is grouped as 00:00:00. First column values are…
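The T-SQL expression above floors each timestamp to its 30-minute bucket. A minimal sketch of the same arithmetic in Python (in Spark SQL itself, the built-in `window(Datetime, '30 minutes')` function produces equivalent buckets):

```python
from datetime import datetime, time

def floor_to_30min(dt: datetime) -> time:
    """Floor a datetime to the start of its 30-minute bucket (time-of-day only)."""
    minute = (dt.minute // 30) * 30
    return time(dt.hour, minute, 0)

# The example from the question: 00:26:01 falls in the 00:00:00 bucket.
print(floor_to_30min(datetime(2023, 3, 9, 0, 26, 1)))  # 00:00:00
print(floor_to_30min(datetime(2023, 3, 9, 0, 31, 0)))  # 00:30:00
```

Grouping by this value reproduces the SQL behavior shown in the question.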
0 votes · 0 answers

Working with Databricks Community Edition using Visual Studio Code in 2023

Is there a way, in 2023, to work with Databricks Community Edition using VS Code? I understand that there is no token generation available for Community Edition, but is there a workaround (back-door kind of thing) or something? If you have any tips…
0 votes · 0 answers

Databricks Community Error: SyntaxWarning: "is" with a literal. Did you mean "=="?

Whenever I execute any code on Databricks Community Edition that uses "is", I get the following warning: SyntaxWarning: "is" with a literal. Did you mean "=="? For example, the following code will generate the warning: if len(chkRows) is 0: I have done some…
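The warning is CPython pointing out that `is` tests object identity, not equality; `len(x) is 0` only appears to work because CPython caches small integers, which is an implementation detail. A minimal illustration of the fix:

```python
chk_rows = []

# Wrong: `is` compares object identity, so `len(chk_rows) is 0` relies on
# CPython's small-integer cache and triggers the SyntaxWarning.

# Right: compare values with ==, or use truthiness for emptiness checks.
if len(chk_rows) == 0:
    print("empty")
if not chk_rows:
    print("also empty")
```

The second form (`if not chk_rows:`) is the idiomatic emptiness check in Python.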
— Patterson
0 votes · 1 answer

Databricks code does not work anymore with 'directory not found' error

This is from an SO question some 4 years ago. It worked in a Databricks notebook. %python import pandas as pd from io import StringIO data = """ CODE,L,PS 5d8A,N,P60490 5d8b,H,P80377 5d8C,O,P60491 """ df = pd.read_csv(StringIO(data),…
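A self-contained version of the snippet from the question, completed as a sketch. A 'directory not found' error usually means a bare string was passed where a file path is expected; wrapping the CSV text in `StringIO` reads from memory, so no directory is involved:

```python
from io import StringIO
import pandas as pd

data = """CODE,L,PS
5d8A,N,P60490
5d8b,H,P80377
5d8C,O,P60491
"""

# StringIO makes pandas treat `data` as file contents rather than a path.
df = pd.read_csv(StringIO(data))
print(df.shape)  # (3, 3)
```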
— thebluephantom
0 votes · 1 answer

Displaying a large dataframe in PySpark on Databricks

I am trying to display all values in a table with 50,000 rows but get the error java.lang.OutOfMemoryError: Java heap space. Is there a way to increase memory to avoid this issue? (Apologies if this is a simple question; I'm very new to this.) The…
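Rendering all 50,000 rows at once pulls everything to the driver; the usual fix is to show one page at a time (e.g. `display(df.limit(1000))` in a Databricks notebook, or `df.toLocalIterator()` to stream rows) rather than raising the heap. A sketch of the paging arithmetic in plain Python:

```python
def pages(rows, page_size=1000):
    """Yield successive fixed-size pages so only one page is in memory at a time."""
    page = []
    for row in rows:
        page.append(row)
        if len(page) == page_size:
            yield page
            page = []
    if page:
        yield page  # final partial page

# With 50,000 rows and 1,000-row pages we get 50 pages.
n_pages = sum(1 for _ in pages(range(50_000), 1000))
print(n_pages)  # 50
```

On a full Databricks cluster `spark.driver.memory` can also be raised at cluster creation, but Community Edition offers little control over that.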
0 votes · 1 answer

Unable to add an Azure Key Vault-backed secret scope in Databricks Community Edition

I am trying to add a secret scope (Azure Key Vault-backed) in Databricks Community Edition, but it gives me the below error; when I check its response, it says ENDPOINT NOT FOUND.
0 votes · 1 answer

Saving a PySpark dataframe from Azure Databricks to a dynamically created folder in Azure Data Lake

I am doing some ETL process in Azure: 1. source data is in Azure Data Lake; 2. it is processed in Azure Databricks; 3. the output dataframe is loaded into Azure Data Lake in a specific folder based on the current year/month/date, and then the file name in…
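The year/month/date folder can be built from the current date before writing. A minimal sketch, where the `abfss://` base URL is a hypothetical placeholder for the question's Data Lake container:

```python
from datetime import datetime
from typing import Optional
import posixpath

def output_path(base: str, name: str, when: Optional[datetime] = None) -> str:
    """Build <base>/YYYY/MM/DD/<name> from the given (or current) date."""
    when = when or datetime.utcnow()
    return posixpath.join(base, f"{when:%Y}", f"{when:%m}", f"{when:%d}", name)

p = output_path("abfss://container@account.dfs.core.windows.net/out",
                "result.parquet", datetime(2023, 3, 9))
print(p)  # abfss://container@account.dfs.core.windows.net/out/2023/03/09/result.parquet
```

The resulting path can then be passed to the usual writer, e.g. `df.write.parquet(p)`.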
0 votes · 1 answer

Do Databricks secrets work with Community Edition?

I'm trying to create an ETL project for resume fluff. The idea is that I would have real estate data web-scraped with Python (done), then cleaned with PySpark in Databricks (done), then have the dataframe pushed to AWS DynamoDB and maybe output to a…
0 votes · 2 answers

Not able to view HIVE databases in Databricks Community Edition

I have been using Databricks Community Edition for many years now, and suddenly I'm no longer able to view HIVE databases. I'm getting the following message when I select Data; I normally see my HIVE databases here. Has there been a massive change?
— Patterson
0 votes · 2 answers

How to mount OneDrive for Business in Databricks

I am trying to mount a OneDrive for Business folder in Databricks Community Edition. I am unable to use onedrivesdk because it is deprecated. I created an app registration, assigned read and write permissions to it, and am using the client ID and…
0 votes · 0 answers

How to read .shp files in Databricks from the FileStore?

I'm using Databricks Community Edition, and I saved a .shp file in the FileStore, but when I try to read it I get this error: DriverError: /dbfs/FileStore/tables/World_Countries.shp: No such file or directory. This is my code: import geopandas as gpd gdf =…
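Two things commonly cause this: a shapefile is a bundle (at minimum the .shp, .shx, and .dbf companions must all be uploaded), and non-Spark libraries like geopandas need the local FUSE path (`/dbfs/...`), not the `dbfs:/` URI. A sketch of the path translation; the notebook-side copy shown in comments assumes a Databricks environment where `dbutils` exists:

```python
# In a Databricks notebook you could first copy the bundle to local disk:
# dbutils.fs.cp("dbfs:/FileStore/tables/World_Countries.shp", "file:/tmp/World_Countries.shp")
# (repeat for the .shx and .dbf companions), then:
# gdf = gpd.read_file("/tmp/World_Countries.shp")

def to_local_path(dbfs_uri: str) -> str:
    """Translate a dbfs:/ URI to the FUSE path that local libraries expect."""
    prefix = "dbfs:/"
    if not dbfs_uri.startswith(prefix):
        raise ValueError(f"not a DBFS URI: {dbfs_uri}")
    return "/dbfs/" + dbfs_uri[len(prefix):]

print(to_local_path("dbfs:/FileStore/tables/World_Countries.shp"))
# /dbfs/FileStore/tables/World_Countries.shp
```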
— BryC
0 votes · 1 answer

How to rewrite pandas .rolling().rank(method='average') in Spark

I have to rewrite pandas code in Spark. I have a problem with this part of the code: #struct_ts is Spark DataFrame #columns_to_take_pctile = #period = 65 (days) / 130 / 260 / .... grp = struct_ts.groupby(["struct",…
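The pandas semantics can be spelled out explicitly: for each row, rank its value within the trailing window of `period` rows, averaging tied ranks. In Spark the trailing window is `Window.partitionBy(...).orderBy(...).rowsBetween(-(period - 1), 0)`, but the built-in rank functions rank by the ordering column, not by the value inside the frame, so the logic usually has to be implemented directly (e.g. in a pandas UDF). A plain-Python sketch of that logic:

```python
def rolling_rank_average(values, window):
    """Rank of each element within its trailing window, ties averaged
    (mirrors pandas .rolling(window).rank(method='average'))."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)  # pandas emits NaN until the window fills
            continue
        frame = values[i + 1 - window : i + 1]
        x = values[i]
        less = sum(1 for v in frame if v < x)
        equal = sum(1 for v in frame if v == x)
        out.append(less + (equal + 1) / 2)  # average rank over ties
    return out

print(rolling_rank_average([3, 1, 2, 5, 4], 3))  # [None, None, 2.0, 3.0, 2.0]
```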
0 votes · 0 answers

How to read a configuration file and pass its details to code in a Databricks environment

I want to read a config file and use it in a notebook. How do I do that? I am not sure how to read a config file in Databricks.
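One common approach is to upload a JSON config to DBFS and read it through the FUSE mount. A minimal sketch; the file name, keys, and the `/dbfs/FileStore/config/...` location mentioned in the comment are hypothetical examples:

```python
import json
from pathlib import Path

# On Databricks this path would typically be the FUSE path to an uploaded
# file, e.g. Path("/dbfs/FileStore/config/etl_config.json").
cfg_path = Path("etl_config.json")
cfg_path.write_text(json.dumps({"input_path": "dbfs:/raw", "batch_size": 500}))

config = json.loads(cfg_path.read_text())
print(config["batch_size"])  # 500
```

The parsed dict can then be passed around the notebook like any other value.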
0 votes · 2 answers

Unable to execute Databricks REST API for data copy using Python

When I execute the code below to copy data from Databricks to local, it fails with an error. Can anyone please help me solve it? import os from databricks_cli.sdk.api_client import ApiClient from…
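Downloading a DBFS file over REST uses the `/api/2.0/dbfs/read` endpoint, which returns base64-encoded chunks that must be decoded and concatenated. A sketch of that loop; `host` and `token` are assumed to be a workspace URL and personal access token (note that Community Edition does not issue tokens, which may be the underlying problem):

```python
import base64

def append_chunk(buf: bytearray, b64_data: str) -> int:
    """Decode one /api/2.0/dbfs/read response chunk into the output buffer."""
    raw = base64.b64decode(b64_data)
    buf.extend(raw)
    return len(raw)

# Sketch of the download loop (requires a reachable workspace and token):
# import requests
# offset, buf = 0, bytearray()
# while True:
#     r = requests.get(f"{host}/api/2.0/dbfs/read",
#                      headers={"Authorization": f"Bearer {token}"},
#                      params={"path": "/FileStore/data/filename.csv",
#                              "offset": offset, "length": 1_000_000})
#     payload = r.json()
#     if payload["bytes_read"] == 0:
#         break
#     offset += append_chunk(buf, payload["data"])

buf = bytearray()
n = append_chunk(buf, base64.b64encode(b"hello").decode())
print(n, bytes(buf))  # 5 b'hello'
```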
0 votes · 0 answers

Automate data import from Databricks to local using the Databricks CLI

I have configured the Databricks CLI and it works fine for the copy-to-local command: databricks fs cp dbfs:/FileStore/data/filename.csv C:\Users. Can I automate this in Python using import databricks_cli? Please let me know how to do it.
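Since the CLI command already works, the simplest automation is to invoke the same command from Python with `subprocess` rather than calling the library internals. A minimal sketch (the `run` call is commented out because it requires the configured `databricks` binary on PATH):

```python
import subprocess
from typing import List

def dbfs_cp(src: str, dst: str, overwrite: bool = False) -> List[str]:
    """Build (and optionally run) the `databricks fs cp` command used manually."""
    cmd = ["databricks", "fs", "cp", src, dst]
    if overwrite:
        cmd.append("--overwrite")
    # subprocess.run(cmd, check=True)  # uncomment where the CLI is installed
    return cmd

print(dbfs_cp("dbfs:/FileStore/data/filename.csv", r"C:\Users", overwrite=True))
```

Wrapping the call in a function makes it easy to loop over many files or schedule the copy.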