Compare commits: 1.8.0 ... 1.7.1 (8 commits)

Author SHA1 Message Date
Greg Price
2e4ae9c5dc Release Zulip Server 1.7.1. 2017-11-22 16:19:48 -08:00
Greg Price
139fb8c2ee changelog: Document 1.7.1 release. 2017-11-22 14:56:26 -08:00
Greg Price
93ffaa73bd i18n: Update translations (including complete Korean!) 2017-11-22 14:42:55 -08:00
Vishnu Ks
960d736e55 registration: Require an explicit realm on PreregistrationUser.
This completes the last commit's work to fix CVE-2017-0910, applying
to any invite links already created before the fix was deployed.  With
this change, all new-user registrations must match an explicit realm
in the PreregistrationUser row, except when creating a new realm.

[greg: rewrote commit message]
2017-11-22 14:42:48 -08:00
Vishnu Ks
28a3dcf787 registration: Check realm against PreregistrationUser realm.
We would allow a user with a valid invitation for one realm to use it
on a different realm instead.  On a server with multiple realms, an
authorized user of one realm could use this (by sending invites to
other email addresses they control) to create accounts on other
realms. (CVE-2017-0910)

With this commit, when sending an invitation, we record the inviting
user's realm on the PreregistrationUser row; and when registering a
user, we check that the PreregistrationUser realm matches the realm the
user is trying to register on.  This resolves CVE-2017-0910 for
newly-sent invitations; the next commit completes the fix.

[greg: rewrote commit message]
2017-11-22 14:42:28 -08:00
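The two registration commits above amount to: record the inviter's realm on the PreregistrationUser row, then refuse registration unless that realm matches the realm being registered on. A minimal sketch of that check (simplified stand-ins, not Zulip's actual models or function names):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Realm:
    string_id: str

@dataclass
class PreregistrationUser:
    email: str
    realm: Optional[Realm]  # recorded at invite time, after the fix

def prereg_realm_matches(prereg: PreregistrationUser, target: Realm,
                         creating_new_realm: bool = False) -> bool:
    """Reject registration unless the invite was issued for this realm.

    Before the fix, an invite for one realm could be redeemed on any
    realm of a multi-realm server (CVE-2017-0910)."""
    if creating_new_realm:
        return True  # a brand-new realm has no pre-existing realm to match
    if prereg.realm is None:
        return False  # invite rows from before the fix carry no realm
    return prereg.realm.string_id == target.string_id
```

Rejecting rows with no recorded realm is what closes the hole for invite links created before the fix was deployed.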
Harshit Bansal
4eb958b6d8 purge-old-deployments: Be sure to preserve last, current, and next.
[greg: revised commit message]
2017-11-22 14:42:20 -08:00
Harshit Bansal
d35d5953c7 purge-old-deployments: Remove unnecessary path juggling.
This is a small refactor that simplifies the bugfix which follows.

[greg: revised commit message]
2017-11-22 14:42:06 -08:00
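The bug class the two purge-old-deployments commits address can be sketched as follows (hypothetical names, not the actual script): resolve the `last`, `current`, and `next` symlinks first, and only offer the remaining deployment directories for deletion.

```python
import os

def deployments_to_purge(deployments_dir: str, dirnames: list) -> list:
    """Return deployment directories that are safe to delete, always
    preserving whatever the last/current/next symlinks point at."""
    protected = set()
    for link in ("last", "current", "next"):
        path = os.path.join(deployments_dir, link)
        if os.path.islink(path):
            # realpath resolves the symlink to its target directory
            protected.add(os.path.basename(os.path.realpath(path)))
    return [d for d in dirnames
            if d not in protected and d not in ("last", "current", "next")]
```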
Tim Abbott
c256c5e91c install: Force a locale so our dependencies can install.
In some environments, either pip itself fails or some packages fail to
install, and setting the locale to en_US.UTF-8 resolves the issue.

We heard reports of this kind of behavior with at least two different
sets of symptoms, with 1.7.0 or its release candidates:
  https://chat.zulip.org/#narrow/stream/general/subject/Trusty.201.2E7.20Upgrade/near/302214
  https://chat.zulip.org/#narrow/stream/production.20help/subject/1.2E6.20to.201.2E7/near/306250

In all reported cases, this fixed it.  This change was included in
1.7.0-rc2 and went through testing there, but by mistake was omitted
from the 1.7.0 release.

[greg: cut LC_CTYPE; move `install` line from very top to
 an appropriate spot; rewrite comments and commit message.]
2017-11-22 14:39:59 -08:00
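The locale fix boils down to forcing `en_US.UTF-8` in the environment before invoking pip. A minimal sketch of the idea in Python (the real change edits the shell installer; `run_pip` here is a hypothetical helper):

```python
import os
import subprocess

def pip_env() -> dict:
    """Environment with the locale pinned to en_US.UTF-8, so pip and
    packages that read files during install don't fail under an unset
    or non-UTF-8 locale."""
    env = dict(os.environ)
    env["LC_ALL"] = "en_US.UTF-8"
    env["LANG"] = "en_US.UTF-8"
    return env

def run_pip(args):
    # e.g. run_pip(["install", "-r", "requirements.txt"])
    return subprocess.run(["pip", *args], env=pip_env(), check=True)
```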
1847 changed files with 52822 additions and 107017 deletions


@@ -1,145 +0,0 @@
# See CircleCI upstream's docs on this config format:
# https://circleci.com/docs/2.0/language-python/
#
version: 2
jobs:
  "trusty-python-3.4":
    docker:
      # This is built from tools/circleci/images/trusty/Dockerfile .
      - image: gregprice/circleci:trusty-python-4.test
    working_directory: ~/zulip
    steps:
      - checkout
      - run:
          name: create cache directories
          command: |
            dirs=(/srv/zulip-{npm,venv}-cache)
            sudo mkdir -p "${dirs[@]}"
            sudo chown -R circleci "${dirs[@]}"
      - restore_cache:
          keys:
            - v1-npm-base.trusty-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
      - restore_cache:
          keys:
            - v1-venv-base.trusty-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
      - run:
          name: install dependencies
          command: |
            # Install moreutils so we can use `ts` and `mispipe` in the following.
            sudo apt-get install -y moreutils
            # CircleCI sets the following in Git config at clone time:
            #   url.ssh://git@github.com.insteadOf https://github.com
            # This breaks the Git clones in the NVM `install.sh` we run
            # in `install-node`.
            # TODO: figure out why that breaks, and whether we want it.
            # (Is it an optimization?)
            rm -f /home/circleci/.gitconfig
            # This is the main setup job for the test suite
            mispipe "tools/travis/setup-backend" ts
            # Cleaning caches is mostly unnecessary in Circle, because
            # most builds don't get to write to the cache.
            # mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
      - save_cache:
          paths:
            - /srv/zulip-npm-cache
          key: v1-npm-base.trusty-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
      - save_cache:
          paths:
            - /srv/zulip-venv-cache
          key: v1-venv-base.trusty-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
      # TODO: in Travis we also cache ~/zulip-emoji-cache, ~/node, ~/misc
      # The moment of truth! Run the tests.
      - run:
          name: run backend tests
          command: |
            . /srv/zulip-py3-venv/bin/activate
            mispipe ./tools/travis/backend ts
      - run:
          name: run frontend tests
          command: |
            . /srv/zulip-py3-venv/bin/activate
            mispipe ./tools/travis/frontend ts
      - run:
          name: upload coverage report
          command: |
            . /srv/zulip-py3-venv/bin/activate
            pip install codecov && codecov \
              || echo "Error in uploading coverage reports to codecov.io."
      # - store_artifacts: # TODO
      #     path: var/casper/
      #     # also /tmp/zulip-test-event-log/
      #     destination: test-reports
  "xenial-python-3.5":
    docker:
      # This is built from tools/circleci/images/xenial/Dockerfile .
      - image: gregprice/circleci:xenial-python-3.test
    working_directory: ~/zulip
    steps:
      - checkout
      - run:
          name: create cache directories
          command: |
            dirs=(/srv/zulip-{npm,venv}-cache)
            sudo mkdir -p "${dirs[@]}"
            sudo chown -R circleci "${dirs[@]}"
      - restore_cache:
          keys:
            - v1-npm-base.xenial-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
      - restore_cache:
          keys:
            - v1-venv-base.xenial-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
      - run:
          name: install dependencies
          command: |
            sudo apt-get install -y moreutils
            rm -f /home/circleci/.gitconfig
            mispipe "tools/travis/setup-backend" ts
      - save_cache:
          paths:
            - /srv/zulip-npm-cache
          key: v1-npm-base.xenial-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
      - save_cache:
          paths:
            - /srv/zulip-venv-cache
          key: v1-venv-base.xenial-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
      - run:
          name: run backend tests
          command: |
            . /srv/zulip-py3-venv/bin/activate
            mispipe ./tools/travis/backend ts
      - run:
          name: upload coverage report
          command: |
            . /srv/zulip-py3-venv/bin/activate
            pip install codecov && codecov \
              || echo "Error in uploading coverage reports to codecov.io."
workflows:
  version: 2
  build:
    jobs:
      - "trusty-python-3.4"
      - "xenial-python-3.5"
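The cache keys in this config follow the pattern "version prefix plus a checksum of each dependency manifest", so the cache is invalidated exactly when a manifest changes. The same idea can be sketched outside CircleCI (hypothetical helper, not part of the config above):

```python
import hashlib

def cache_key(prefix: str, *manifest_paths: str) -> str:
    """Build a cache key in the spirit of
    v1-npm-base.trusty-{{ checksum "package.json" }}-...:
    the key changes whenever any dependency manifest changes."""
    parts = [prefix]
    for path in manifest_paths:
        with open(path, "rb") as f:
            parts.append(hashlib.sha256(f.read()).hexdigest()[:12])
    return "-".join(parts)
```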


@@ -5,6 +5,6 @@ coverage:
project:
default:
target: auto
-threshold: 0.50
+threshold: 0.03
base: auto
patch: off


@@ -6,7 +6,7 @@ charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
-[*.{sh,py,pyi,js,json,yml,xml,css,md,markdown,handlebars,html}]
+[*.{sh,py,js,json,yml,xml,css,md,markdown,handlebars,html}]
indent_style = space
indent_size = 4


@@ -15,9 +15,9 @@
"XDate": false,
"zxcvbn": false,
"LazyLoad": false,
"Dropbox": false,
"SockJS": false,
"marked": false,
"md5": false,
"moment": false,
"i18n": false,
"DynamicText": false,
@@ -31,22 +31,13 @@
"popovers": false,
"server_events": false,
"server_events_dispatch": false,
"message_scroll": false,
"keydown_util": false,
"info_overlay": false,
"ui": false,
"ui_report": false,
"night_mode": false,
"ui_util": false,
"lightbox": false,
"input_pill": false,
"user_pill": false,
"compose_pm_pill": false,
"stream_color": false,
"people": false,
"user_groups": false,
"navigate": false,
"toMarkdown": false,
"settings_account": false,
"settings_display": false,
"settings_notifications": false,
@@ -56,13 +47,9 @@
"settings_sections": false,
"settings_emoji": false,
"settings_org": false,
"settings_ui": false,
"settings_users": false,
"settings_streams": false,
"settings_filters": false,
"settings_invites": false,
"settings_user_groups": false,
"settings_profile_fields": false,
"settings": false,
"resize": false,
"loading": false,
@@ -71,7 +58,6 @@
"typing_data": false,
"typing_status": false,
"sent_messages": false,
"transmit": false,
"compose": false,
"compose_actions": false,
"compose_state": false,
@@ -93,10 +79,8 @@
"gear_menu": false,
"hashchange": false,
"hash_util": false,
"FetchStatus": false,
"message_list": false,
"Filter": false,
"flatpickr": false,
"pointer": false,
"util": false,
"MessageListView": false,
@@ -161,23 +145,18 @@
"recent_senders": false,
"unread_ui": false,
"unread_ops": false,
"upload": false,
"user_events": false,
"Plotly": false,
"emoji_codes": false,
"drafts": false,
"katex": false,
"ClipboardJS": false,
"Clipboard": false,
"emoji_picker": false,
"hotspots": false,
"compose_ui": false,
"common": false,
"panels": false,
"PerfectScrollbar": false
"desktop_notifications_panel": false
},
"plugins": [
"eslint-plugin-empty-returns"
],
"rules": {
"array-callback-return": "error",
"array-bracket-spacing": "error",
@@ -197,7 +176,6 @@
"complexity": [ 0, 4 ],
"curly": 2,
"dot-notation": [ "error", { "allowKeywords": true } ],
"empty-returns/main": "error",
"eol-last": [ "error", "always" ],
"eqeqeq": 2,
"func-style": [ "off", "expression" ],

.gitattributes (vendored): 2 lines changed

@@ -1,7 +1,6 @@
* text=auto eol=lf
*.gif binary
*.jpg binary
*.jpeg binary
*.eot binary
*.woff binary
*.woff2 binary
@@ -10,4 +9,3 @@
*.png binary
*.otf binary
*.tif binary
yarn.lock binary


@@ -1,14 +0,0 @@
<!-- What's this PR for? (Just a link to an issue is fine.) -->
**Testing Plan:** <!-- How have you tested? -->
**GIFs or Screenshots:** <!-- If a UI change. See:
https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html
-->
<!-- Also be sure to make clear, coherent commits:
https://zulip.readthedocs.io/en/latest/contributing/version-control.html
-->

.gitignore (vendored): 17 lines changed

@@ -12,26 +12,19 @@
# * Subdirectories with several internal things to ignore get their own
# `.gitignore` files.
#
# * Comments must be on their own line. (Otherwise they don't work.)
#
# See `git help ignore` for details on the format.
## Config files for the dev environment
/zproject/dev-secrets.conf
/tools/conf.ini
/tools/custom_provision
/tools/droplets/conf.ini
## Byproducts of setting up and using the dev environment
*.pyc
package-lock.json
/.vagrant
/var
# Dockerfiles generated for CircleCI
/tools/circleci/images
# Static build
*.mo
npm-debug.log
@@ -44,11 +37,6 @@ npm-debug.log
# Test / analysis tools
.coverage
## Files (or really symlinks) created in a prod deployment
/zproject/prod_settings.py
/zulip-current-venv
/zulip-py3-venv
## Files left by various editors and local environments
# (Ideally these should be in everyone's respective personal gitignore files.)
*~
@@ -63,11 +51,6 @@ zulip.kdev4
*.sublime-workspace
.vscode/
*.DS_Store
# .cache/ is generated by VSCode's test runner
.cache/
.eslintcache
## Miscellaneous
# (Ideally this section is empty.)
zthumbor/thumbor_local_settings.py
.transifexrc


@@ -1,10 +1,10 @@
[general]
-ignore=title-trailing-punctuation, body-min-length, body-is-missing, title-imperative-mood
+ignore=title-trailing-punctuation, body-min-length, body-is-missing
extra-path=tools/lib/gitlint-rules.py
[title-match-regex-allow-exception]
regex=^(.+:\ )?[A-Z].+\.$
[title-match-regex]
regex=^.+\.$
[title-max-length]
line-length=76
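The two regex rules in this gitlint fragment enforce Zulip's commit-title style: the title must end with a period, and the allow-exception variant additionally expects a capitalized summary after an optional "area: " prefix. A quick illustration (the `title_ok` helper is ours, not part of gitlint):

```python
import re

# Rules copied from the .gitlint fragment above:
TITLE_RE = re.compile(r"^.+\.$")  # [title-match-regex]: must end with a period
EXCEPTION_RE = re.compile(r"^(.+:\ )?[A-Z].+\.$")  # [title-match-regex-allow-exception]

def title_ok(title: str) -> bool:
    """True if the commit title satisfies the base rule."""
    return bool(TITLE_RE.match(title))
```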


@@ -1,10 +0,0 @@
[settings]
line_length = 79
multi_line_output = 2
balanced_wrapping = true
known_third_party = django, ujson, sqlalchemy
known_first_party = zerver, zproject, version, confirmation, zilencer, analytics, frontend_tests, scripts, corporate
sections = FUTURE, STDLIB, THIRDPARTY, FIRSTPARTY, LOCALFOLDER
lines_after_imports = 1
# See the comment related to ioloop_logging for why this is skipped.
skip = zerver/management/commands/runtornado.py


@@ -1,11 +1,9 @@
-# See https://zulip.readthedocs.io/en/latest/testing/travis.html for
+# See https://zulip.readthedocs.io/en/latest/travis.html for
# high-level documentation on our Travis CI setup.
dist: trusty
group: deprecated-2017Q4
install:
# Disable sometimes-broken sources.list in Travis base images
- sudo rm -vf /etc/apt/sources.list.d/*
- sudo apt-get update
# Disable broken riak sources.list in Travis base image 2017-10-18
- rm -vf "/etc/apt/sources.list.d/*riak*"
# Disable Travis CI's built-in NVM installation
- mispipe "mv ~/.nvm ~/.travis-nvm-disabled" ts
@@ -35,7 +33,6 @@ cache:
- $HOME/zulip-npm-cache
- $HOME/zulip-emoji-cache
- $HOME/node
- $HOME/misc
env:
global:
- BOTO_CONFIG=/tmp/nowhere
@@ -48,9 +45,14 @@ matrix:
# that doesn't seem to be documented, but it's what we see empirically.
# We only get 4 jobs running at a time, so we try to make the first few
# the most likely to break.
- python: "3.4"
env: TEST_SUITE=frontend
- python: "3.4"
env: TEST_SUITE=backend
- python: "3.4"
env: TEST_SUITE=production
# Other suites moved to CircleCI -- see .circleci/.
- python: "3.5"
env: TEST_SUITE=backend
sudo: required
addons:
artifacts:
@@ -65,3 +67,9 @@ addons:
- moreutils
after_success:
- codecov
notifications:
webhooks:
urls:
- https://zulip.org/zulipbot/travis
on_success: always
on_failure: always


@@ -13,21 +13,3 @@ source_file = static/locale/en/translations.json
source_lang = en
type = KEYVALUEJSON
file_filter = static/locale/<lang>/translations.json
[zulip.messages]
source_file = static/locale/en/mobile.json
source_lang = en
type = KEYVALUEJSON
file_filter = static/locale/<lang>/mobile.json
[zulip-test.djangopo]
source_file = static/locale/en/LC_MESSAGES/django.po
source_lang = en
type = PO
file_filter = static/locale/<lang>/LC_MESSAGES/django.po
[zulip-test.translationsjson]
source_file = static/locale/en/translations.json
source_lang = en
type = KEYVALUEJSON
file_filter = static/locale/<lang>/translations.json


@@ -1,333 +0,0 @@
# Contributing to Zulip
Welcome to the Zulip community!
## Community
The
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html)
is the primary communication forum for the Zulip community. It is a good
place to start whether you have a question, are a new contributor, are a new
user, or anything else. Make sure to read the
[community norms](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html#community-norms)
before posting. The Zulip community is also governed by a
[code of conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html).
You can subscribe to zulip-devel@googlegroups.com for a lower traffic (~1
email/month) way to hear about things like mentorship opportunities with Google
Code-in, in-person sprints at conferences, and other opportunities to
contribute.
## Ways to contribute
To make a code or documentation contribution, read our
[step-by-step guide](#your-first-codebase-contribution) to getting
started with the Zulip codebase. A small sample of the type of work that
needs doing:
* Bug squashing and feature development on our Python/Django
[backend](https://github.com/zulip/zulip), web
[frontend](https://github.com/zulip/zulip), React Native
[mobile app](https://github.com/zulip/zulip-mobile), or Electron
[desktop app](https://github.com/zulip/zulip-electron).
* Building out our
[Python API and bots](https://github.com/zulip/python-zulip-api) framework.
* [Writing an integration](https://zulipchat.com/api/integration-guide).
* Improving our [user](https://zulipchat.com/help/) or
[developer](https://zulip.readthedocs.io/en/latest/) documentation.
* [Reviewing code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html)
and manually testing pull requests.
**Non-code contributions**: Some of the most valuable ways to contribute
don't require touching the codebase at all. We list a few of them below:
* [Reporting issues](#reporting-issues), including both feature requests and
bug reports.
* [Giving feedback](#user-feedback) if you are evaluating or using Zulip.
* [Translating](https://zulip.readthedocs.io/en/latest/translating/translating.html)
Zulip.
* [Outreach](#zulip-outreach): Star us on GitHub, upvote us
on product comparison sites, or write for [the Zulip blog](http://blog.zulip.org/).
## Your first (codebase) contribution
This section has a step by step guide to starting as a Zulip codebase
contributor. It's long, but don't worry about doing all the steps perfectly;
no one gets it right the first time, and there are a lot of people available
to help.
* First, make an account on the
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html),
paying special attention to the community norms. If you'd like, introduce
yourself in
[#new members](https://chat.zulip.org/#narrow/stream/new.20members), using
your name as the topic. Bonus: tell us about your first impressions of
Zulip, and anything that felt confusing/broken as you started using the
product.
* Read [What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
* [Install the development environment](https://zulip.readthedocs.io/en/latest/development/overview.html),
getting help in
[#development help](https://chat.zulip.org/#narrow/stream/development.20help)
if you run into any troubles.
* Read the
[Zulip guide to Git](https://zulip.readthedocs.io/en/latest/git/index.html)
and do the Git tutorial (coming soon) if you are unfamiliar with Git,
getting help in
[#git help](https://chat.zulip.org/#narrow/stream/git.20help) if you run
into any troubles.
* Sign the
[Dropbox Contributor License Agreement](https://opensource.dropbox.com/cla/).
### Picking an issue
Now, you're ready to pick your first issue! There are hundreds of open issues
in the main codebase alone. This section will help you find an issue to work
on.
* If you're interested in
[mobile](https://github.com/zulip/zulip-mobile/issues?q=is%3Aopen+is%3Aissue),
[desktop](https://github.com/zulip/zulip-electron/issues?q=is%3Aopen+is%3Aissue),
or
[bots](https://github.com/zulip/python-zulip-api/issues?q=is%3Aopen+is%3Aissue)
development, check the respective links for open issues, or post in
[#mobile](https://chat.zulip.org/#narrow/stream/mobile),
[#electron](https://chat.zulip.org/#narrow/stream/electron), or
[#bots](https://chat.zulip.org/#narrow/stream/bots).
* For the main server and web repository, start by looking through issues
with the label
[good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A"good+first+issue").
These are smaller projects particularly suitable for a first contribution.
* We also partition all of our issues in the main repo into areas like
admin, compose, emoji, hotkeys, i18n, onboarding, search, etc. Look
through our [list of labels](https://github.com/zulip/zulip/labels), and
click on some of the `area:` labels to see all the issues related to your
areas of interest.
* If the lists of issues are overwhelming, post in
[#new members](https://chat.zulip.org/#narrow/stream/new.20members) with a
bit about your background and interests, and we'll help you out. The most
important thing to say is whether you're looking for a backend (Python),
frontend (JavaScript), mobile (React Native), desktop (Electron),
documentation (English) or visual design (JavaScript + CSS) issue, and a
bit about your programming experience and available time.
We also welcome suggestions of features that you feel would be valuable or
changes that you feel would make Zulip a better open source project. If you
have a new feature you'd like to add, we recommend you start by posting in
[#new members](https://chat.zulip.org/#narrow/stream/new.20members) with the
feature idea and the problem that you're hoping to solve.
Other notes:
* For a first pull request, it's better to aim for a smaller contribution
than a bigger one. Many first contributions have fewer than 10 lines of
changes (not counting changes to tests).
* The full list of issues looking for a contributor can be found with the
[help wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
label.
* For most new contributors, there's a lot to learn while making your first
pull request. It's OK if it takes you a while; that's normal! You'll be
able to work a lot faster as you build experience.
### Working on an issue
To work on an issue, claim it by adding a comment with `@zulipbot claim` to
the issue thread. [Zulipbot](https://github.com/zulip/zulipbot) is a GitHub
workflow bot; it will assign you to the issue and label the issue as "in
progress". Some additional notes:
* You're encouraged to ask questions on how to best implement or debug your
changes -- the Zulip maintainers are excited to answer questions to help
you stay unblocked and working efficiently. You can ask questions on
chat.zulip.org, or on the GitHub issue or pull request.
* We encourage early pull requests for work in progress. Prefix the title of
work in progress pull requests with `[WIP]`, and remove the prefix when
you think it might be mergeable and want it to be reviewed.
* After updating a PR, add a comment to the GitHub thread mentioning that it
is ready for another review. GitHub only notifies maintainers of the
changes when you post a comment, so if you don't, your PR will likely be
neglected by accident!
### And beyond
A great place to look for a second issue is to look for issues with the same
`area:` label as the last issue you resolved. You'll be able to reuse the
work you did learning how that part of the codebase works. Also, the path to
becoming a core developer often involves taking ownership of one of these area
labels.
## What makes a great Zulip contributor?
Zulip runs a lot of [internship programs](#internship-programs), so we have
a lot of experience with new contributors. In our experience, these are the
best predictors of success:
* Posting good questions. This generally means explaining your current
understanding, saying what you've done or tried so far, and including
tracebacks or other error messages if appropriate.
* Learning and practicing
[Git commit discipline](https://zulip.readthedocs.io/en/latest/contributing/version-control.html#commit-discipline).
* Submitting carefully tested code. This generally means checking your work
through a combination of automated tests and manually clicking around the
UI trying to find bugs in your work. See
[things to look for](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#things-to-look-for)
for additional ideas.
* Posting
[screenshots or GIFs](https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html)
for frontend changes.
* Being responsive to feedback on pull requests. This means incorporating or
responding to all suggested changes, and leaving a note if you won't be
able to address things within a few days.
* Being helpful and friendly on chat.zulip.org.
These are also the main criteria we use to select interns for all of our
internship programs.
## Reporting issues
If you find an easily reproducible bug and/or are experienced in reporting
bugs, feel free to just open an issue on the relevant project on GitHub.
If you have a feature request or are not yet sure what the underlying bug
is, the best place to post issues is
[#issues](https://chat.zulip.org/#narrow/stream/issues) (or
[#mobile](https://chat.zulip.org/#narrow/stream/mobile) or
[#electron](https://chat.zulip.org/#narrow/stream/electron)) on the
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html).
This allows us to interactively figure out what is going on, let you know if
a similar issue has already been opened, and collect any other information
we need. Choose a 2-4 word topic that describes the issue, explain the issue
and how to reproduce it if known, your browser/OS if relevant, and a
[screenshot or screenGIF](https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html)
if appropriate.
**Reporting security issues**. Please do not report security issues
publicly, including on public streams on chat.zulip.org. You can email
zulip-security@googlegroups.com. We create a CVE for every security issue.
## User feedback
Nearly every feature we develop starts with a user request. If you are part
of a group that is either using or considering using Zulip, we would love to
hear about your experience with the product. If you're not sure what to
write, here are some questions we're always very curious to know the answer
to:
* Evaluation: What is the process by which your organization chose or will
choose a group chat product?
* Pros and cons: What are the pros and cons of Zulip for your organization,
and the pros and cons of other products you are evaluating?
* Features: What are the features that are most important for your
organization? In the best case scenario, what would your chat solution do
for you?
* Onboarding: If you remember it, what was your impression during your first
few minutes of using Zulip? What did you notice, and how did you feel? Was
there anything that stood out to you as confusing, or broken, or great?
* Organization: What does your organization do? How big is the organization?
A link to your organization's website?
## Internship programs
Zulip runs internship programs with
[Outreachy](https://www.outreachy.org/),
[Google Summer of Code (GSoC)](https://developers.google.com/open-source/gsoc/)
[1], and the
[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram),
and has in the past taken summer interns from Harvard, MIT, and
Stanford.
While each third-party program has its own rules and requirements, the
Zulip community approaches all of these programs with these ideas in
mind:
* We try to make the application process as valuable for the applicant as
possible. Expect high quality code reviews, a supportive community, and
publicly viewable patches you can link to from your resume, regardless of
whether you are selected.
* To apply, you'll have to submit at least one pull request to a Zulip
repository. Most students accepted to one of our programs have
several merged pull requests (including at least one larger PR) by
the time of the application deadline.
* The main criteria we use are the quality of your best contributions and
the bullets listed at
[What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
Because we focus on evaluating your best work, it doesn't hurt your
application to make mistakes in your first few PRs as long as your
work improves.
Zulip also participates in
[Google Code-In](https://developers.google.com/open-source/gci/). Our
selection criteria for Finalists and Grand Prize Winners are the same as our
selection criteria for interns above.
Most of our interns end up sticking around the project long-term, and many
quickly become core team members. We hope you apply!
### Google Summer of Code
GSoC is by far the largest of our internship programs (we had 14 GSoC
students in summer 2017). While we don't control how many slots
Google allocates to Zulip, we hope to mentor a similar number of
students in 2018.
If you're reading this well before the application deadline and want
to make your application strong, we recommend getting involved in the
community and fixing issues in Zulip now. Having good contributions
and building a reputation for doing good work is the best way to have a
strong application. About half of Zulip's GSoC students for Summer
2017 had made significant contributions to the project by February
2017, and about half had not. Our
[GSoC project ideas page][gsoc-guide] has lots more details on how
Zulip does GSoC, as well as project ideas (though the project idea
list is maintained only during the GSoC application period, so if
you're looking at some other time of year, the project list is likely
out-of-date).
In some past years, we have also run a Zulip Summer of Code (ZSoC)
program for students whom we didn't have enough slots to accept for
GSoC but were able to find funding for. Student expectations are the
same as with GSoC, and it has no separate application process; your
GSoC application is your ZSoC application. If we'd like to select you
for ZSoC, we'll contact you when the GSoC results are announced.
[gsoc-guide]: https://zulip.readthedocs.io/en/latest/overview/gsoc-ideas.html
[gsoc-faq]: https://developers.google.com/open-source/gsoc/faq
[1] Formally, [GSoC isn't an internship][gsoc-faq], but it is similar
enough that we're treating it as such for the purposes of this
documentation.
## Zulip Outreach
**Upvoting Zulip**. Upvotes and reviews make a big difference in the public
perception of projects like Zulip. We've collected a few sites below
where we know Zulip has been discussed. Doing everything in the following
list typically takes about 15 minutes.
* Star us on GitHub. There are four main repositories:
[server/web](https://github.com/zulip/zulip),
[mobile](https://github.com/zulip/zulip-mobile),
[desktop](https://github.com/zulip/zulip-electron), and
[Python API](https://github.com/zulip/python-zulip-api).
* [Follow us](https://twitter.com/zulip) on Twitter.
For both of the following, you'll need to make an account on the site if you
don't already have one.
* [Like Zulip](https://alternativeto.net/software/zulip-chat-server/) on
AlternativeTo. We recommend upvoting a couple of other products you like
as well, both to give back to their community, and since single-upvote
accounts are generally given less weight. You can also
[upvote Zulip](https://alternativeto.net/software/slack/) on their page
for Slack.
* [Add Zulip to your stack](https://stackshare.io/zulip) on StackShare, star
it, and upvote the reasons why people like Zulip that you find most
compelling. Again, we recommend adding a few other products that you like
as well.
We have a doc with more detailed instructions and a few other sites, if you
have been using Zulip for a while and want to contribute more.
**Blog posts**. Writing a blog post about your experiences with Zulip, or
about a technical aspect of Zulip can be a great way to spread the word
about Zulip.
We also occasionally [publish](http://blog.zulip.org/) longer form
articles related to Zulip. Our posts typically get tens of thousands
of views, and we always have good ideas for blog posts that we can
outline but don't have time to write. If you are an experienced writer
or copyeditor, send us a portfolio; we'd love to talk!

LICENSE: 46 lines changed

@@ -1,24 +1,3 @@
Copyright 2011-2017 Dropbox, Inc., Kandra Labs, Inc., and contributors
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
The software includes some works released by third parties under other
free and open source licenses. Those works are redistributed under the
license terms under which the works were received. For more details,
see the ``docs/THIRDPARTY`` file included with this distribution.
--------------------------------------------------------------------------------
Apache License
Version 2.0, January 2004
@@ -196,3 +175,28 @@ see the ``docs/THIRDPARTY`` file included with this distribution.
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

README.md

@@ -1,83 +1,301 @@
**[Zulip overview](#zulip-overview)** |
**[Community](#community)** |
**[Installing for dev](#installing-the-zulip-development-environment)** |
**[Installing for production](#running-zulip-in-production)** |
**[Ways to contribute](#ways-to-contribute)** |
**[How to get involved](#how-to-get-involved-with-contributing-to-zulip)** |
**[License](#license)**
# Zulip overview
Zulip is a powerful, open source group chat application that combines the
immediacy of real-time chat with the productivity benefits of threaded
conversations. Zulip is used by open source projects, Fortune 500 companies,
large standards bodies, and others who need a real-time chat system that
allows users to easily process hundreds or thousands of messages a day. With
over 300 contributors merging over 500 commits a month, Zulip is also the
largest and fastest growing open source group chat project.
Zulip is a powerful, open source group chat application. Written in
Python and using the Django framework, Zulip supports both private
messaging and group chats via conversation streams.
[![CircleCI Build Status](https://circleci.com/gh/zulip/zulip.svg?style=svg)](https://circleci.com/gh/zulip/zulip)
[![Travis Build Status](https://travis-ci.org/zulip/zulip.svg?branch=master)](https://travis-ci.org/zulip/zulip)
Zulip also supports fast search, drag-and-drop file uploads, image
previews, group private messages, audible notifications,
missed-message emails, desktop apps, and much more.
Further information on the Zulip project and its features can be found
at <https://www.zulip.org>.
[![Build Status](https://travis-ci.org/zulip/zulip.svg?branch=master)](https://travis-ci.org/zulip/zulip)
[![Coverage Status](https://img.shields.io/codecov/c/github/zulip/zulip.svg)](https://codecov.io/gh/zulip/zulip)
[![Mypy coverage](https://img.shields.io/badge/mypy-100%25-green.svg)][mypy-coverage]
[![docs](https://readthedocs.org/projects/zulip/badge/?version=latest)](https://zulip.readthedocs.io/en/latest/)
[![Mypy coverage](https://img.shields.io/badge/mypy-100%25-green.svg)](http://blog.zulip.org/2016/10/13/static-types-in-python-oh-mypy/)
[![docs](https://readthedocs.org/projects/zulip/badge/?version=latest)](http://zulip.readthedocs.io/en/latest/)
[![Zulip chat](https://img.shields.io/badge/zulip-join_chat-brightgreen.svg)](https://chat.zulip.org)
[![Twitter](https://img.shields.io/badge/twitter-@zulip-blue.svg?style=flat)](https://twitter.com/zulip)
[![Twitter](https://img.shields.io/badge/twitter-@zulip-blue.svg?style=flat)](http://twitter.com/zulip)
[mypy-coverage]: https://blog.zulip.org/2016/10/13/static-types-in-python-oh-mypy/
## Community
## Getting started
There are several places online where folks discuss Zulip.
Click on the appropriate link below. If nothing seems to apply,
join us on the
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html)
and tell us what's up!
* The primary place is the
[Zulip development community Zulip server][czo-doc] at
chat.zulip.org.
You might be interested in:
* For Google Summer of Code students and applicants, we have
[a mailing list](https://groups.google.com/forum/#!forum/zulip-gsoc)
for help, questions, and announcements. But it's often simpler to
[visit chat.zulip.org][czo-doc] instead.
* **Contributing code**. Check out our
[guide for new contributors](https://zulip.readthedocs.io/en/latest/overview/contributing.html)
to get started. Zulip prides itself on maintaining a clean and
well-tested codebase, and a stock of hundreds of
[beginner-friendly issues][beginner-friendly].
* We have a [public development discussion mailing list][zulip-devel],
zulip-devel, which is currently pretty low traffic because most
discussions happen in our public Zulip instance. We use it to
announce Zulip developer community gatherings and ask for feedback on
major technical or design decisions. It has several hundred
subscribers, so you can use it to ask questions about features or
possible bugs, but please don't use it to ask for generic help getting
started as a contributor (e.g. because you want to do Google Summer of
Code). The rest of this page covers how to get involved in the Zulip
project in detail.
* **Contributing non-code**.
[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issue),
[translate](https://zulip.readthedocs.io/en/latest/translating/translating.html) Zulip
into your language,
[write](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach)
for the Zulip blog, or
[give us feedback](https://zulip.readthedocs.io/en/latest/overview/contributing.html#user-feedback). We
would love to hear from you, even if you're just trying the product out.
* Zulip also has a [blog](https://blog.zulip.org/) and
[twitter account](https://twitter.com/zulip).
* **Supporting Zulip**. Advocate for your organization to use Zulip, write a
review in the mobile app stores, or
[upvote Zulip](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach) on
product comparison sites.
* Last but not least, we use [GitHub](https://github.com/zulip/zulip)
to track Zulip-related issues (and store our code, of course).
Anybody with a GitHub account should be able to create Issues there
pertaining to bugs or enhancement requests. We also use Pull Requests
as our primary mechanism to receive code contributions.
* **Checking Zulip out**. The best way to see Zulip in action is to drop by
the
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html). We
also recommend reading Zulip for
[open source](https://zulipchat.com/for/open-source/), Zulip for
[companies](https://zulipchat.com/for/companies/), or Zulip for
[working groups and part time communities](https://zulipchat.com/for/working-groups-and-communities/).
The Zulip community has a [Code of Conduct][code-of-conduct].
* **Running a Zulip server**. Setting up a server takes just a couple of
minutes. Zulip runs on Ubuntu 16.04 Xenial and Ubuntu 14.04 Trusty. The
installation process is
[documented here](https://zulip.readthedocs.io/en/1.7.1/prod.html).
Commercial support is available; see <https://zulipchat.com/plans> for
details.
[zulip-devel]: https://groups.google.com/forum/#!forum/zulip-devel
* **Using Zulip without setting up a server**. <https://zulipchat.com> offers
free and commercial hosting.
## Installing the Zulip Development environment
* **Applying for a Zulip internship**. Zulip runs internship programs with
[Outreachy](https://www.outreachy.org/),
[Google Summer of Code](https://developers.google.com/open-source/gsoc/),
and the
[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram). Zulip
also participates in
[Google Code-In](https://developers.google.com/open-source/gci/). More
information is available
[here](https://zulip.readthedocs.io/en/latest/overview/contributing.html#internship-programs).
The Zulip development environment is the recommended option for folks
interested in trying out Zulip, since it is very easy to install.
This is documented in [the developer installation guide][dev-install].
You may also be interested in reading our [blog](http://blog.zulip.org/) or
following us on [twitter](https://twitter.com/zulip).
Zulip is distributed under the
[Apache 2.0](https://github.com/zulip/zulip/blob/master/LICENSE) license.
## Running Zulip in production
[beginner-friendly]: https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22
Zulip in production supports Ubuntu 16.04 Xenial and Ubuntu 14.04
Trusty. We're happy to support work to enable Zulip to run on
additional platforms. The installation process is
[documented here](https://zulip.readthedocs.io/en/latest/prod.html).
## Ways to contribute
Zulip welcomes all forms of contributions! This page documents the
Zulip development process.
* **Pull requests**. Before a pull request can be merged, you need to
sign the [Dropbox Contributor License Agreement][cla]. Also,
please skim our [commit message style guidelines][doc-commit-style].
We encourage early pull requests for work in progress. Prefix the title
of your pull request with `[WIP]` and reference it when asking for
community feedback. When you are ready for final review, remove
the `[WIP]`.
* **Testing**. The Zulip automated tests all run automatically when
you submit a pull request, but you can also run them all in your
development environment following the instructions in the [testing
docs][doc-test]. You can also try out [our new desktop
client][electron], which is in alpha; we'd appreciate testing and
[feedback](https://github.com/zulip/zulip-electron/issues/new).
* **Developer Documentation**. Zulip has a growing collection of
developer documentation on [Read The Docs][doc]. Recommended reading
for new contributors includes the [directory structure][doc-dirstruct]
and [new feature tutorial][doc-newfeat]. You can also improve
[Zulip.org][z-org].
* **Mailing lists and bug tracker**. Zulip has a [development
discussion mailing list](#community) and uses [GitHub issues
][gh-issues]. There are also lists for the [Android][email-android]
and [iOS][email-ios] apps. Feel free to send any questions or
suggestions of areas where you'd love to see more documentation to the
relevant list! Check out our [bug report guidelines][bug-report]
before submitting. Please report any security issues you discover to
zulip-security@googlegroups.com.
* **App codebases**. This repository is for the Zulip server and web
app (including most integrations). The
[beta React Native mobile app][mobile], [Java Android app][Android]
(see [our mobile strategy][mobile-strategy]),
[new Electron desktop app][electron], and
[legacy Qt-based desktop app][desktop] are all separate repositories.
* **Glue code**. We maintain a [Hubot adapter][hubot-adapter] and several
integrations ([Phabricator][phab], [Jenkins][], [Puppet][], [Redmine][],
and [Trello][]), plus [node.js API bindings][node], an [isomorphic
JavaScript library][zulip-js], and a [full-text search PostgreSQL
extension][tsearch], as separate repos.
* **Translations**. Zulip is in the process of being translated into
10+ languages, and we love contributions to our translations. See our
[translating documentation][transifex] if you're interested in
contributing!
* **Code Reviews**. Zulip is all about community and helping each
other out. Check out [#code review][code-review] on
[chat.zulip.org][czo-doc] to help review PRs and give comments on
other people's work. Everyone is welcome to participate, even those
new to Zulip! Even just checking out the code, manually testing it,
and posting on whether or not it worked is valuable.
[cla]: https://opensource.dropbox.com/cla/
[code-of-conduct]: https://zulip.readthedocs.io/en/latest/code-of-conduct.html
[dev-install]: https://zulip.readthedocs.io/en/latest/dev-overview.html
[doc]: https://zulip.readthedocs.io/
[doc-commit-style]: http://zulip.readthedocs.io/en/latest/version-control.html#commit-messages
[doc-dirstruct]: http://zulip.readthedocs.io/en/latest/directory-structure.html
[doc-newfeat]: http://zulip.readthedocs.io/en/latest/new-feature-tutorial.html
[doc-test]: http://zulip.readthedocs.io/en/latest/testing.html
[electron]: https://github.com/zulip/zulip-electron
[gh-issues]: https://github.com/zulip/zulip/issues
[desktop]: https://github.com/zulip/zulip-desktop
[android]: https://github.com/zulip/zulip-android
[mobile]: https://github.com/zulip/zulip-mobile
[mobile-strategy]: https://github.com/zulip/zulip-android/blob/master/android-strategy.md
[email-android]: https://groups.google.com/forum/#!forum/zulip-android
[email-ios]: https://groups.google.com/forum/#!forum/zulip-ios
[hubot-adapter]: https://github.com/zulip/hubot-zulip
[jenkins]: https://github.com/zulip/zulip-jenkins-plugin
[node]: https://github.com/zulip/zulip-node
[zulip-js]: https://github.com/zulip/zulip-js
[phab]: https://github.com/zulip/phabricator-to-zulip
[puppet]: https://github.com/matthewbarr/puppet-zulip
[redmine]: https://github.com/zulip/zulip-redmine-plugin
[trello]: https://github.com/zulip/trello-to-zulip
[tsearch]: https://github.com/zulip/tsearch_extras
[transifex]: https://zulip.readthedocs.io/en/latest/translating.html#testing-translations
[z-org]: https://github.com/zulip/zulip.github.io
[code-review]: https://chat.zulip.org/#narrow/stream/code.20review
[bug-report]: http://zulip.readthedocs.io/en/latest/bug-reports.html
## Google Summer of Code
We participated in
[GSoC](https://developers.google.com/open-source/gsoc/) in 2016 (with
[great results](https://blog.zulip.org/2016/10/13/static-types-in-python-oh-mypy/))
and [are participating](https://github.com/zulip/zulip.github.io/blob/master/gsoc-ideas.md)
in 2017 as well.
## How to get involved with contributing to Zulip
First, subscribe to the Zulip [development discussion mailing
list](#community).
The Zulip project uses a system of labels in our [issue
tracker][gh-issues] to make it easy to find a project if you don't
have your own project idea in mind or want to get some experience with
working on Zulip before embarking on a larger project you have in
mind:
* [Integrations](https://github.com/zulip/zulip/labels/area%3A%20integrations).
Integrate Zulip with another piece of software and contribute it
back to the community! Writing an integration can be a great first
contribution. There's detailed documentation on how to write
integrations in [the Zulip integration writing
guide](https://zulip.readthedocs.io/en/latest/integration-guide.html).
* [Good first issue](https://github.com/zulip/zulip/labels/good%20first%20issue):
Smaller projects that might be a great first contribution.
* [Documentation](https://github.com/zulip/zulip/labels/area%3A%20documentation):
The Zulip project loves contributions of new documentation.
* [Help Wanted](https://github.com/zulip/zulip/labels/help%20wanted):
A broader list of projects that nobody is currently working on.
* [Platform support](https://github.com/zulip/zulip/labels/Platform%20support):
These are open issues about making it possible to install Zulip on a
wider range of platforms.
* [Bugs](https://github.com/zulip/zulip/labels/bug): Open bugs.
* [Feature requests](https://github.com/zulip/zulip/labels/enhancement):
Browsing this list can be a great way to find feature ideas to
implement that other Zulip users are excited about.
* [2016 roadmap milestone](http://zulip.readthedocs.io/en/latest/roadmap.html):
The projects that are
[priorities for the Zulip project](https://zulip.readthedocs.io/en/latest/roadmap.html).
These are great projects if you're looking to make an impact.
Another way to find issues in Zulip is to take advantage of our
`area:<foo>` convention in separating out issues. We partition all of
our issues into areas like admin, compose, emoji, hotkeys, i18n,
onboarding, search, etc. Look through our
[list of labels](https://github.com/zulip/zulip/labels), and click on
some of the `area:` labels to see all the tickets related to your
areas of interest.
If you're excited about helping with an open issue, make sure to claim
it by posting the comment
"**@zulipbot** claim". **@zulipbot** will assign you to the issue and
label the issue as **in progress**. For more details, check out
[**@zulipbot**](https://github.com/zulip/zulipbot).
You're encouraged to ask questions on how to best implement or debug
your changes -- the Zulip maintainers are excited to answer questions
to help you stay unblocked and working efficiently. It's great to ask
questions in comments on GitHub issues and pull requests, or
[on chat.zulip.org][czo-doc]. We'll direct longer discussions to
Zulip chat, but please post a summary of what you learned from the
chat, or link to the conversation, in a comment on the GitHub issue.
We also welcome suggestions of features that you feel would be
valuable or changes that you feel would make Zulip a better open
source project, and are happy to support you in adding new features or
other user experience improvements to Zulip.
If you have a new feature you'd like to add, we recommend you start by
opening a GitHub issue about the feature idea explaining the problem
that you're hoping to solve and that you're excited to work on it. A
Zulip maintainer will usually reply within a day with feedback on the
idea, notes on any important issues or concerns, and often tips on
how to implement or test it. Please feel free to ping the thread if
you don't hear a response from the maintainers -- we try to be very
responsive so this usually means we missed your message.
For significant changes to the visual design, user experience, data
model, or architecture, we highly recommend posting a mockup,
screenshot, or description of what you have in mind to the
[#design](https://chat.zulip.org/#narrow/stream/design) stream on
[chat.zulip.org][czo-doc] to get broad feedback before you spend too
much time on implementation details.
Finally, before implementing a larger feature, we highly recommend
looking at the
[new feature tutorial](http://zulip.readthedocs.io/en/latest/new-feature-tutorial.html)
and [coding style guidelines](http://zulip.readthedocs.io/en/latest/code-style.html)
on ReadTheDocs.
Feedback on how to make this development process more efficient, fun,
and friendly to new contributors is very welcome! Just send an email
to the [zulip-devel](#community) list with your thoughts.
When you feel like you have completed your work on an issue, post your
PR to the
[#code review](https://chat.zulip.org/#narrow/stream/code.20review)
stream on [chat.zulip.org][czo-doc]. This is our lightweight process
that gives other developers the opportunity to give you comments and
suggestions on your work.
## License
Copyright 2011-2017 Dropbox, Inc., Kandra Labs, Inc., and contributors
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
The software includes some works released by third parties under other
free and open source licenses. Those works are redistributed under the
license terms under which the works were received. For more details,
see the ``docs/THIRDPARTY`` file included with this distribution.
[czo-doc]: https://zulip.readthedocs.io/en/latest/chat-zulip-org.html

Vagrantfile

@@ -10,7 +10,7 @@ end
if Vagrant::VERSION == "1.8.7" then
path = `which curl`
if path.include?('/opt/vagrant/embedded/bin/curl') then
puts "In Vagrant 1.8.7, curl is broken. Please use Vagrant 2.0.2 "\
puts "In Vagrant 1.8.7, curl is broken. Please use Vagrant 1.8.6 "\
"or run 'sudo rm -f /opt/vagrant/embedded/bin/curl' to fix the "\
"issue before provisioning. See "\
"https://github.com/mitchellh/vagrant/issues/7997 "\
@@ -19,53 +19,6 @@ if Vagrant::VERSION == "1.8.7" then
end
end
# Workaround: the lxc-config in vagrant-lxc is incompatible with changes in
# LXC 2.1.0, found in Ubuntu 17.10 artful. LXC 2.1.1 (in 18.04 LTS bionic)
# ignores the old config key, so this will only be needed for artful.
#
# vagrant-lxc upstream has an attempted fix:
# https://github.com/fgrehm/vagrant-lxc/issues/445
# but it didn't work in our testing. This is a temporary issue, so we just
# hack in a fix: we patch the skeleton `lxc-config` file right in the
# distribution of the vagrant-lxc "box" we use. If the user doesn't yet
# have the box (e.g. on first setup), Vagrant would download it but too
# late for us to patch it like this; so we prompt them to explicitly add it
# first and then rerun.
if ['up', 'provision'].include? ARGV[0]
if command? "lxc-ls"
LXC_VERSION = `lxc-ls --version`.strip unless defined? LXC_VERSION
if LXC_VERSION == "2.1.0"
lxc_config_file = ENV['HOME'] + "/.vagrant.d/boxes/fgrehm-VAGRANTSLASH-trusty64-lxc/1.2.0/lxc/lxc-config"
if File.file?(lxc_config_file)
lines = File.readlines(lxc_config_file)
deprecated_line = "lxc.pivotdir = lxc_putold\n"
if lines[1] == deprecated_line
lines[1] = "# #{deprecated_line}"
File.open(lxc_config_file, 'w') do |f|
f.puts(lines)
end
end
else
puts 'You are running LXC 2.1.0, and fgrehm/trusty64-lxc box is incompatible '\
"with it by default. First add the box by doing:\n"\
" vagrant box add https://vagrantcloud.com/fgrehm/trusty64-lxc\n"\
'Once this command succeeds, do "vagrant up" again.'
exit
end
end
end
end
# Workaround: Vagrant removed the atlas.hashicorp.com to
# vagrantcloud.com redirect in February 2018. The value of
# DEFAULT_SERVER_URL in Vagrant versions less than 1.9.3 is
# atlas.hashicorp.com, which means that removal broke the fetching and
# updating of boxes (since the old URL doesn't work). See
# https://github.com/hashicorp/vagrant/issues/9442
if Vagrant::DEFAULT_SERVER_URL == "atlas.hashicorp.com"
Vagrant::DEFAULT_SERVER_URL.replace('https://vagrantcloud.com')
end
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
# For LXC. VirtualBox hosts use a different box, described below.
@@ -77,13 +30,7 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
host_ip_addr = "127.0.0.1"
config.vm.synced_folder ".", "/vagrant", disabled: true
if (/darwin/ =~ RUBY_PLATFORM) != nil
config.vm.synced_folder ".", "/srv/zulip", type: "nfs",
linux__nfs_options: ['rw']
config.vm.network "private_network", type: "dhcp"
else
config.vm.synced_folder ".", "/srv/zulip"
end
config.vm.synced_folder ".", "/srv/zulip"
vagrant_config_file = ENV['HOME'] + "/.zulip-vagrant-config"
if File.file?(vagrant_config_file)
@@ -153,29 +100,10 @@ $provision_script = <<SCRIPT
set -x
set -e
set -o pipefail
# Code should go here, rather than tools/provision, only if it is
# something that we don't want to happen when running provision in a
# development environment not using Vagrant.
# Set the MOTD on the system to have Zulip instructions
sudo rm -f /etc/update-motd.d/*
sudo bash -c 'cat << EndOfMessage > /etc/motd
Welcome to the Zulip development environment! Popular commands:
* tools/provision - Update the development environment
* tools/run-dev.py - Run the development server
* tools/lint - Run the linter (quick and catches many problems)
* tools/test-* - Run tests (use --help to learn about options)
Read https://zulip.readthedocs.io/en/latest/testing/testing.html to learn
how to run individual test suites so that you can get a fast debug cycle.
EndOfMessage'
# If the host is running SELinux remount the /sys/fs/selinux directory as read only,
# needed for apt-get to work.
if [ -d "/sys/fs/selinux" ]; then
sudo mount -o remount,ro /sys/fs/selinux
fi
# Set default locale, this prevents errors if the user has another locale set.
@@ -183,35 +111,14 @@ if ! grep -q 'LC_ALL=en_US.UTF-8' /etc/default/locale; then
echo "LC_ALL=en_US.UTF-8" | sudo tee -a /etc/default/locale
fi
# Set an environment variable, so that we won't print the virtualenv
# shell warning (it'll be wrong, since the shell is dying anyway)
export SKIP_VENV_SHELL_WARNING=1
# End `set -x`, so that the end of provision doesn't look like an error
# message after a successful run.
set +x
# Check if the zulip directory is writable
if [ ! -w /srv/zulip ]; then
echo "The vagrant user is unable to write to the zulip directory."
echo "To fix this, run the following commands on the host machine:"
# sudo is required since our uid is not 1000
echo ' vagrant halt -f'
echo ' rm -rf /PATH/TO/ZULIP/CLONE/.vagrant'
echo ' sudo chown -R 1000:$(whoami) /PATH/TO/ZULIP/CLONE'
echo "Replace /PATH/TO/ZULIP/CLONE with the path to where zulip code is cloned."
echo "You can resume setting up your vagrant environment by running:"
echo " vagrant up"
exit 1
fi
# Provision the development environment
ln -nsf /srv/zulip ~/zulip
/srv/zulip/tools/provision
# Run any custom provision hooks the user has configured
if [ -f /srv/zulip/tools/custom_provision ]; then
chmod +x /srv/zulip/tools/custom_provision
/srv/zulip/tools/custom_provision
fi
SCRIPT


@@ -1,40 +1,38 @@
import time
from collections import OrderedDict, defaultdict
from datetime import datetime, timedelta
import logging
from typing import Any, Callable, Dict, List, \
Optional, Text, Tuple, Type, Union
from django.conf import settings
from django.db import connection, models
from django.db.models import F
from analytics.models import Anomaly, BaseCount, \
FillState, InstallationCount, RealmCount, StreamCount, \
UserCount, installation_epoch, last_successful_fill
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, floor_to_hour, verify_UTC
from zerver.models import Message, Realm, RealmAuditLog, \
Stream, UserActivityInterval, UserProfile, models
from analytics.models import InstallationCount, RealmCount, \
UserCount, StreamCount, BaseCount, FillState, Anomaly, installation_epoch, \
last_successful_fill
from zerver.models import Realm, UserProfile, Message, Stream, \
UserActivityInterval, RealmAuditLog, models
from zerver.lib.timestamp import floor_to_day, floor_to_hour, ceiling_to_day, \
ceiling_to_hour, verify_UTC
from typing import Any, Callable, Dict, List, Optional, Text, Tuple, Type, Union
from collections import defaultdict, OrderedDict
from datetime import timedelta, datetime
from zerver.lib.logging_util import create_logger
import time
## Logging setup ##
logger = logging.getLogger('zulip.management')
log_to_file(logger, settings.ANALYTICS_LOG_PATH)
logger = create_logger('zulip.management', settings.ANALYTICS_LOG_PATH, 'INFO')
# You can't subtract timedelta.max from a datetime, so use this instead
TIMEDELTA_MAX = timedelta(days=365*1000)
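The comment above refers to a standard-library limitation: `datetime.timedelta.max` is roughly 2.7 million days, so subtracting it from any `datetime` overflows the representable year range (1 to 9999). A minimal standalone sketch, independent of the Zulip codebase, illustrating why a large-but-finite sentinel is used instead:

```python
from datetime import datetime, timedelta

# Subtracting timedelta.max from a datetime raises OverflowError,
# because the result falls outside datetime's supported year range.
try:
    datetime(2017, 11, 22) - timedelta.max
except OverflowError:
    print("overflow")

# A large-but-finite sentinel (as in TIMEDELTA_MAX above) stays in range.
TIMEDELTA_MAX = timedelta(days=365 * 1000)
earliest = datetime(2017, 11, 22) - TIMEDELTA_MAX
print(earliest < datetime(2017, 11, 22))  # True
```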
## Class definitions ##
class CountStat:
class CountStat(object):
HOUR = 'hour'
DAY = 'day'
FREQUENCIES = frozenset([HOUR, DAY])
def __init__(self, property: str, data_collector: 'DataCollector', frequency: str,
interval: Optional[timedelta]=None) -> None:
def __init__(self, property, data_collector, frequency, interval=None):
# type: (str, DataCollector, str, Optional[timedelta]) -> None
self.property = property
self.data_collector = data_collector
# might have to do something different for bitfields
@@ -48,28 +46,31 @@ class CountStat:
else: # frequency == CountStat.DAY
self.interval = timedelta(days=1)
def __str__(self) -> Text:
return "<CountStat: %s>" % (self.property,)
def __unicode__(self):
# type: () -> Text
return u"<CountStat: %s>" % (self.property,)
class LoggingCountStat(CountStat):
def __init__(self, property: str, output_table: Type[BaseCount], frequency: str) -> None:
def __init__(self, property, output_table, frequency):
# type: (str, Type[BaseCount], str) -> None
CountStat.__init__(self, property, DataCollector(output_table, None), frequency)
class DependentCountStat(CountStat):
def __init__(self, property: str, data_collector: 'DataCollector', frequency: str,
interval: Optional[timedelta]=None, dependencies: List[str]=[]) -> None:
def __init__(self, property, data_collector, frequency, interval=None, dependencies=[]):
# type: (str, DataCollector, str, Optional[timedelta], List[str]) -> None
CountStat.__init__(self, property, data_collector, frequency, interval=interval)
self.dependencies = dependencies
class DataCollector:
def __init__(self, output_table: Type[BaseCount],
pull_function: Optional[Callable[[str, datetime, datetime], int]]) -> None:
class DataCollector(object):
def __init__(self, output_table, pull_function):
# type: (Type[BaseCount], Optional[Callable[[str, datetime, datetime], int]]) -> None
self.output_table = output_table
self.pull_function = pull_function
## CountStat-level operations ##
def process_count_stat(stat: CountStat, fill_to_time: datetime) -> None:
def process_count_stat(stat, fill_to_time):
# type: (CountStat, datetime) -> None
if stat.frequency == CountStat.HOUR:
time_increment = timedelta(hours=1)
elif stat.frequency == CountStat.DAY:
@@ -119,14 +120,16 @@ def process_count_stat(stat: CountStat, fill_to_time: datetime) -> None:
currently_filled = currently_filled + time_increment
logger.info("DONE %s (%dms)" % (stat.property, (end-start)*1000))
def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int) -> None:
def do_update_fill_state(fill_state, end_time, state):
# type: (FillState, datetime, int) -> None
fill_state.end_time = end_time
fill_state.state = state
fill_state.save()
# We assume end_time is valid (e.g. is on a day or hour boundary as appropriate)
# and is timezone aware. It is the caller's responsibility to enforce this!
def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime) -> None:
def do_fill_count_stat_at_hour(stat, end_time):
# type: (CountStat, datetime) -> None
start_time = end_time - stat.interval
if not isinstance(stat, LoggingCountStat):
timer = time.time()
@@ -136,7 +139,8 @@ def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime) -> None:
(stat.property, (time.time()-timer)*1000, rows_added))
do_aggregate_to_summary_table(stat, end_time)
def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
def do_delete_counts_at_hour(stat, end_time):
# type: (CountStat, datetime) -> None
if isinstance(stat, LoggingCountStat):
InstallationCount.objects.filter(property=stat.property, end_time=end_time).delete()
if stat.data_collector.output_table in [UserCount, StreamCount]:
@@ -147,7 +151,8 @@ def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
RealmCount.objects.filter(property=stat.property, end_time=end_time).delete()
InstallationCount.objects.filter(property=stat.property, end_time=end_time).delete()
def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
def do_aggregate_to_summary_table(stat, end_time):
# type: (CountStat, datetime) -> None
cursor = connection.cursor()
# Aggregate into RealmCount
@@ -172,8 +177,7 @@ def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
start = time.time()
cursor.execute(realmcount_query, {'end_time': end_time})
end = time.time()
logger.info("%s RealmCount aggregation (%dms/%sr)" % (
stat.property, (end - start) * 1000, cursor.rowcount))
logger.info("%s RealmCount aggregation (%dms/%sr)" % (stat.property, (end-start)*1000, cursor.rowcount))
# Aggregate into InstallationCount
installationcount_query = """
@@ -190,16 +194,14 @@ def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
start = time.time()
cursor.execute(installationcount_query, {'end_time': end_time})
end = time.time()
logger.info("%s InstallationCount aggregation (%dms/%sr)" % (
stat.property, (end - start) * 1000, cursor.rowcount))
logger.info("%s InstallationCount aggregation (%dms/%sr)" % (stat.property, (end-start)*1000, cursor.rowcount))
cursor.close()
## Utility functions called from outside counts.py ##
# called from zerver/lib/actions.py; should not throw any errors
def do_increment_logging_stat(zerver_object: Union[Realm, UserProfile, Stream], stat: CountStat,
subgroup: Optional[Union[str, int, bool]], event_time: datetime,
increment: int=1) -> None:
def do_increment_logging_stat(zerver_object, stat, subgroup, event_time, increment=1):
# type: (Union[Realm, UserProfile, Stream], CountStat, Optional[Union[str, int, bool]], datetime, int) -> None
table = stat.data_collector.output_table
if table == RealmCount:
id_args = {'realm': zerver_object}
@@ -220,7 +222,8 @@ def do_increment_logging_stat(zerver_object: Union[Realm, UserProfile, Stream],
row.value = F('value') + increment
row.save(update_fields=['value'])
def do_drop_all_analytics_tables() -> None:
def do_drop_all_analytics_tables():
# type: () -> None
UserCount.objects.all().delete()
StreamCount.objects.all().delete()
RealmCount.objects.all().delete()
@@ -228,7 +231,8 @@ def do_drop_all_analytics_tables() -> None:
FillState.objects.all().delete()
Anomaly.objects.all().delete()
def do_drop_single_stat(property: str) -> None:
def do_drop_single_stat(property):
# type: (str) -> None
UserCount.objects.filter(property=property).delete()
StreamCount.objects.filter(property=property).delete()
RealmCount.objects.filter(property=property).delete()
@@ -237,8 +241,8 @@ def do_drop_single_stat(property: str) -> None:
## DataCollector-level operations ##
def do_pull_by_sql_query(property: str, start_time: datetime, end_time: datetime, query: str,
group_by: Optional[Tuple[models.Model, str]]) -> int:
def do_pull_by_sql_query(property, start_time, end_time, query, group_by):
# type: (str, datetime, datetime, str, Optional[Tuple[models.Model, str]]) -> int
if group_by is None:
subgroup = 'NULL'
group_by_clause = ''
@@ -258,13 +262,15 @@ def do_pull_by_sql_query(property: str, start_time: datetime, end_time: datetime
cursor.close()
return rowcount
def sql_data_collector(output_table: Type[BaseCount], query: str,
group_by: Optional[Tuple[models.Model, str]]) -> DataCollector:
def pull_function(property: str, start_time: datetime, end_time: datetime) -> int:
def sql_data_collector(output_table, query, group_by):
# type: (Type[BaseCount], str, Optional[Tuple[models.Model, str]]) -> DataCollector
def pull_function(property, start_time, end_time):
# type: (str, datetime, datetime) -> int
return do_pull_by_sql_query(property, start_time, end_time, query, group_by)
return DataCollector(output_table, pull_function)
def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime) -> int:
def do_pull_minutes_active(property, start_time, end_time):
# type: (str, datetime, datetime) -> int
user_activity_intervals = UserActivityInterval.objects.filter(
end__gt=start_time, start__lt=end_time
).select_related(
@@ -288,8 +294,7 @@ count_message_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_userprofile.id, zerver_userprofile.realm_id, count(*),
'%(property)s', %(subgroup)s, %%(time_end)s
zerver_userprofile.id, zerver_userprofile.realm_id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_userprofile
JOIN zerver_message
ON
@@ -331,9 +336,7 @@ count_message_type_by_user_query = """
LEFT JOIN zerver_stream
ON
zerver_recipient.type_id = zerver_stream.id
GROUP BY
zerver_userprofile.realm_id, zerver_userprofile.id,
zerver_recipient.type, zerver_stream.invite_only
GROUP BY zerver_userprofile.realm_id, zerver_userprofile.id, zerver_recipient.type, zerver_stream.invite_only
) AS subquery
GROUP BY realm_id, id, message_type
"""
@@ -518,11 +521,6 @@ count_stats_ = [
CountStat.DAY, interval=timedelta(days=15)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('minutes_active::day', DataCollector(UserCount, do_pull_minutes_active), CountStat.DAY),
# Rate limiting stats
# Used to limit the number of invitation emails sent by a realm
LoggingCountStat('invites_sent::day', RealmCount, CountStat.DAY),
# Dependent stats
# Must come after their dependencies.

@@ -1,14 +1,19 @@
from zerver.models import Realm, UserProfile, Stream, Message
from analytics.models import InstallationCount, RealmCount, UserCount, StreamCount
from analytics.lib.counts import CountStat
from analytics.lib.time_utils import time_range
from datetime import datetime
from math import sqrt
from random import gauss, random, seed
from typing import List
from analytics.lib.counts import CountStat
from six.moves import zip
def generate_time_series_data(days: int=100, business_hours_base: float=10,
non_business_hours_base: float=10, growth: float=1,
autocorrelation: float=0, spikiness: float=1,
holiday_rate: float=0, frequency: str=CountStat.DAY,
partial_sum: bool=False, random_seed: int=26) -> List[int]:
def generate_time_series_data(days=100, business_hours_base=10, non_business_hours_base=10,
growth=1, autocorrelation=0, spikiness=1, holiday_rate=0,
frequency=CountStat.DAY, partial_sum=False, random_seed=26):
# type: (int, float, float, float, float, float, float, str, bool, int) -> List[int]
"""
Generate semi-realistic looking time series data for testing analytics graphs.

@@ -1,15 +1,16 @@
from zerver.lib.timestamp import floor_to_hour, floor_to_day, \
timestamp_to_datetime, verify_UTC
from analytics.lib.counts import CountStat
from datetime import datetime, timedelta
from typing import List, Optional
from analytics.lib.counts import CountStat
from zerver.lib.timestamp import floor_to_day, floor_to_hour, verify_UTC
# If min_length is None, returns end_times from ceiling(start) to floor(end), inclusive.
# If min_length is greater than 0, pads the list to the left.
# So informally, time_range(Sep 20, Sep 22, day, None) returns [Sep 20, Sep 21, Sep 22],
# and time_range(Sep 20, Sep 22, day, 5) returns [Sep 18, Sep 19, Sep 20, Sep 21, Sep 22]
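The informal contract described in the comments above can be sketched as a minimal standalone version. This is an illustration only, not the Zulip implementation (the name `time_range_sketch` is invented here): it assumes `start <= end` and that both already fall on frequency boundaries, so the ceiling/floor and UTC-verification steps of the real function are omitted.

```python
from datetime import datetime, timedelta
from typing import List, Optional

def time_range_sketch(start: datetime, end: datetime,
                      step: timedelta, min_length: Optional[int]) -> List[datetime]:
    """Return end_times from `start` to `end` inclusive, stepping by `step`,
    padded on the left so the result has at least `min_length` entries."""
    times = []  # built newest-first, then reversed
    current = end
    while current >= start:
        times.append(current)
        current -= step
    if min_length is not None:
        # Pad to the left (earlier in time) until min_length is reached.
        while len(times) < min_length:
            times.append(times[-1] - step)
    times.reverse()
    return times
```

So `time_range_sketch(Sep 20, Sep 22, timedelta(days=1), 5)` yields Sep 18 through Sep 22, matching the comment's second example.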
def time_range(start: datetime, end: datetime, frequency: str,
min_length: Optional[int]) -> List[datetime]:
def time_range(start, end, frequency, min_length):
# type: (datetime, datetime, str, Optional[int]) -> List[datetime]
verify_UTC(start)
verify_UTC(end)
if frequency == CountStat.HOUR:

@@ -1,14 +1,14 @@
import datetime
import logging
import time
from typing import Any, Dict
from django.core.management.base import BaseCommand, CommandParser
from zerver.models import Recipient, Message
from zerver.lib.timestamp import timestamp_to_datetime
from zerver.models import Message, Recipient
import datetime
import time
import logging
def compute_stats(log_level: int) -> None:
def compute_stats(log_level):
# type: (int) -> None
logger = logging.getLogger()
logger.setLevel(log_level)
@@ -71,10 +71,12 @@ def compute_stats(log_level: int) -> None:
class Command(BaseCommand):
help = "Compute statistics on MIT Zephyr usage."
def add_arguments(self, parser: CommandParser) -> None:
def add_arguments(self, parser):
# type: (CommandParser) -> None
parser.add_argument('--verbose', default=False, action='store_true')
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
level = logging.INFO
if options["verbose"]:
level = logging.DEBUG

@@ -1,13 +1,14 @@
import datetime
from typing import Any, Dict
from zerver.lib.statistics import seconds_usage_between
from django.core.management.base import BaseCommand, CommandParser
from zerver.models import UserProfile
import datetime
from django.utils.timezone import utc
from zerver.lib.statistics import seconds_usage_between
from zerver.models import UserProfile
def analyze_activity(options: Dict[str, Any]) -> None:
def analyze_activity(options):
# type: (Dict[str, Any]) -> None
day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=utc)
day_end = day_start + datetime.timedelta(days=options["duration"])
@@ -46,11 +47,13 @@ Usage: ./manage.py analyze_user_activity [--realm=zulip] [--date=2013-09-10] [--
By default, if no date is selected 2013-09-10 is used. If no realm is provided, information
is shown for all realms"""
def add_arguments(self, parser: CommandParser) -> None:
def add_arguments(self, parser):
# type: (CommandParser) -> None
parser.add_argument('--realm', action='store')
parser.add_argument('--date', action='store', default="2013-09-06")
parser.add_argument('--duration', action='store', default=1, type=int,
help="How many days to show usage information for")
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
analyze_activity(options)

@@ -1,85 +0,0 @@
from argparse import ArgumentParser
from datetime import timedelta
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from analytics.models import InstallationCount, installation_epoch, \
last_successful_fill
from analytics.lib.counts import COUNT_STATS, CountStat
from zerver.lib.timestamp import floor_to_hour, floor_to_day, verify_UTC, \
TimezoneNotUTCException
from zerver.models import Realm
import os
import subprocess
import sys
import time
from typing import Any, Dict
states = {
0: "OK",
1: "WARNING",
2: "CRITICAL",
3: "UNKNOWN"
}
class Command(BaseCommand):
help = """Checks FillState table.
Run as a cron job that runs every hour."""
def handle(self, *args: Any, **options: Any) -> None:
fill_state = self.get_fill_state()
status = fill_state['status']
message = fill_state['message']
state_file_path = "/var/lib/nagios_state/check-analytics-state"
state_file_tmp = state_file_path + "-tmp"
with open(state_file_tmp, "w") as f:
f.write("%s|%s|%s|%s\n" % (
int(time.time()), status, states[status], message))
subprocess.check_call(["mv", state_file_tmp, state_file_path])
def get_fill_state(self) -> Dict[str, Any]:
if not Realm.objects.exists():
return {'status': 0, 'message': 'No realms exist, so not checking FillState.'}
warning_unfilled_properties = []
critical_unfilled_properties = []
for property, stat in COUNT_STATS.items():
last_fill = last_successful_fill(property)
if last_fill is None:
last_fill = installation_epoch()
try:
verify_UTC(last_fill)
except TimezoneNotUTCException:
return {'status': 2, 'message': 'FillState not in UTC for %s' % (property,)}
if stat.frequency == CountStat.DAY:
floor_function = floor_to_day
warning_threshold = timedelta(hours=26)
critical_threshold = timedelta(hours=50)
else: # CountStat.HOUR
floor_function = floor_to_hour
warning_threshold = timedelta(minutes=90)
critical_threshold = timedelta(minutes=150)
if floor_function(last_fill) != last_fill:
return {'status': 2, 'message': 'FillState not on %s boundary for %s' %
(stat.frequency, property)}
time_to_last_fill = timezone_now() - last_fill
if time_to_last_fill > critical_threshold:
critical_unfilled_properties.append(property)
elif time_to_last_fill > warning_threshold:
warning_unfilled_properties.append(property)
if len(critical_unfilled_properties) == 0 and len(warning_unfilled_properties) == 0:
return {'status': 0, 'message': 'FillState looks fine.'}
if len(critical_unfilled_properties) == 0:
return {'status': 1, 'message': 'Missed filling %s once.' %
(', '.join(warning_unfilled_properties),)}
return {'status': 2, 'message': 'Missed filling %s once. Missed filling %s at least twice.' %
(', '.join(warning_unfilled_properties), ', '.join(critical_unfilled_properties))}

@@ -1,20 +1,24 @@
import sys
from argparse import ArgumentParser
from typing import Any
from argparse import ArgumentParser
from django.db import connection
from django.core.management.base import BaseCommand
from analytics.lib.counts import do_drop_all_analytics_tables
from typing import Any
class Command(BaseCommand):
help = """Clear analytics tables."""
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('--force',
action='store_true',
help="Clear analytics tables.")
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
if options['force']:
do_drop_all_analytics_tables()
else:

@@ -1,15 +1,18 @@
import sys
from argparse import ArgumentParser
from typing import Any
from argparse import ArgumentParser
from django.db import connection
from django.core.management.base import BaseCommand
from analytics.lib.counts import COUNT_STATS, do_drop_single_stat
from analytics.lib.counts import do_drop_single_stat, COUNT_STATS
from typing import Any
class Command(BaseCommand):
help = """Clear analytics tables."""
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('--force',
action='store_true',
help="Actually do it.")
@@ -17,7 +20,8 @@ class Command(BaseCommand):
type=str,
help="The property of the stat to be cleared.")
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
property = options['property']
if property not in COUNT_STATS:
print("Invalid property: %s" % (property,))

@@ -1,13 +1,14 @@
import datetime
from argparse import ArgumentParser
from typing import Any
from argparse import ArgumentParser
from django.db.models import Count, QuerySet
from django.utils.timezone import now as timezone_now
from zerver.lib.management import ZulipBaseCommand
from zerver.models import UserActivity
import datetime
class Command(ZulipBaseCommand):
help = """Report rough client activity globally, for a realm, or for a user
@@ -17,16 +18,18 @@ Usage examples:
./manage.py client_activity --target realm --realm zulip
./manage.py client_activity --target user --user hamlet@zulip.com --realm zulip"""
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('--target', dest='target', required=True, type=str,
help="'server' will calculate client activity of the entire server. "
"'realm' will calculate client activity of realm. "
"'user' will calculate client activity of the user.")
parser.add_argument('--user', dest='user', type=str,
help="The email address of the user you want to calculate activity.")
help="The email address of the user you want to calculate activity.")
self.add_realm_args(parser)
def compute_activity(self, user_activity_objects: QuerySet) -> None:
def compute_activity(self, user_activity_objects):
# type: (QuerySet) -> None
# Report data from the past week.
#
# This is a rough report of client activity because we inconsistently
@@ -56,7 +59,8 @@ Usage examples:
print("%25s %15d" % (count[1], count[0]))
print("Total:", total)
def handle(self, *args: Any, **options: str) -> None:
def handle(self, *args, **options):
# type: (*Any, **str) -> None
realm = self.get_realm(options)
if options["user"] is None:
if options["target"] == "server" and realm is None:

@@ -1,18 +1,22 @@
from datetime import datetime, timedelta
from typing import Any, Dict, List, Mapping, Optional, Text, Type, Union
from argparse import ArgumentParser
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from analytics.lib.counts import COUNT_STATS, \
CountStat, do_drop_all_analytics_tables
from analytics.lib.counts import COUNT_STATS, CountStat, do_drop_all_analytics_tables
from analytics.lib.fixtures import generate_time_series_data
from analytics.lib.time_utils import time_range
from analytics.models import BaseCount, FillState, RealmCount, UserCount, StreamCount
from analytics.models import BaseCount, InstallationCount, RealmCount, \
UserCount, StreamCount, FillState
from zerver.lib.timestamp import floor_to_day
from zerver.models import Realm, UserProfile, Stream, Message, Client, \
RealmAuditLog, Recipient
RealmAuditLog
from datetime import datetime, timedelta
from six.moves import zip
from typing import Any, Dict, List, Optional, Text, Type, Union, Mapping
class Command(BaseCommand):
help = """Populates analytics tables with randomly generated data."""
@@ -20,11 +24,8 @@ class Command(BaseCommand):
DAYS_OF_DATA = 100
random_seed = 26
def create_user(self, email: Text,
full_name: Text,
is_staff: bool,
date_joined: datetime,
realm: Realm) -> UserProfile:
def create_user(self, email, full_name, is_staff, date_joined, realm):
# type: (Text, Text, bool, datetime, Realm) -> UserProfile
user = UserProfile.objects.create(
email=email, full_name=full_name, is_staff=is_staff,
realm=realm, short_name=full_name, pointer=-1, last_pointer_updater='none',
@@ -34,10 +35,10 @@ class Command(BaseCommand):
event_time=user.date_joined)
return user
def generate_fixture_data(self, stat: CountStat, business_hours_base: float,
non_business_hours_base: float, growth: float,
autocorrelation: float, spikiness: float,
holiday_rate: float=0, partial_sum: bool=False) -> List[int]:
def generate_fixture_data(self, stat, business_hours_base, non_business_hours_base,
growth, autocorrelation, spikiness, holiday_rate=0,
partial_sum=False):
# type: (CountStat, float, float, float, float, float, float, bool) -> List[int]
self.random_seed += 1
return generate_time_series_data(
days=self.DAYS_OF_DATA, business_hours_base=business_hours_base,
@@ -45,7 +46,8 @@ class Command(BaseCommand):
autocorrelation=autocorrelation, spikiness=spikiness, holiday_rate=holiday_rate,
frequency=stat.frequency, partial_sum=partial_sum, random_seed=self.random_seed)
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
do_drop_all_analytics_tables()
# I believe this also deletes any objects with this realm as a foreign key
Realm.objects.filter(string_id='analytics').delete()
@@ -55,22 +57,15 @@ class Command(BaseCommand):
realm = Realm.objects.create(
string_id='analytics', name='Analytics', date_created=installation_time)
shylock = self.create_user('shylock@analytics.ds', 'Shylock', True, installation_time, realm)
stream = Stream.objects.create(
name='all', realm=realm, date_created=installation_time)
Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
def insert_fixture_data(stat: CountStat,
fixture_data: Mapping[Optional[str], List[int]],
table: Type[BaseCount]) -> None:
def insert_fixture_data(stat, fixture_data, table):
# type: (CountStat, Mapping[Optional[str], List[int]], Type[BaseCount]) -> None
end_times = time_range(last_end_time, last_end_time, stat.frequency,
len(list(fixture_data.values())[0]))
if table == RealmCount:
id_args = {'realm': realm}
if table == UserCount:
id_args = {'realm': realm, 'user': shylock}
if table == StreamCount:
id_args = {'stream': stream, 'realm': realm}
for subgroup, values in fixture_data.items():
table.objects.bulk_create([
table(property=stat.property, subgroup=subgroup, end_time=end_time,
@@ -139,12 +134,4 @@ class Command(BaseCommand):
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)
stat = COUNT_STATS['messages_in_stream:is_bot:day']
realm_data = {'false': self.generate_fixture_data(stat, 30, 5, 6, .6, 4),
'true': self.generate_fixture_data(stat, 20, 2, 3, .2, 3)}
insert_fixture_data(stat, realm_data, RealmCount)
stream_data = {'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2)} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, stream_data, StreamCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)
# TODO: messages_sent_to_stream:is_bot

@@ -1,14 +1,15 @@
import datetime
from argparse import ArgumentParser
from typing import Any, List
from argparse import ArgumentParser
import datetime
import pytz
from django.core.management.base import BaseCommand
from django.db.models import Count
from django.utils.timezone import now as timezone_now
from zerver.models import Message, Realm, Recipient, Stream, \
Subscription, UserActivity, UserMessage, UserProfile, get_realm
from zerver.models import UserProfile, Realm, Stream, Message, Recipient, UserActivity, \
Subscription, UserMessage, get_realm
MOBILE_CLIENT_LIST = ["Android", "ios"]
HUMAN_CLIENT_LIST = MOBILE_CLIENT_LIST + ["website"]
@@ -18,11 +19,13 @@ human_messages = Message.objects.filter(sending_client__name__in=HUMAN_CLIENT_LI
class Command(BaseCommand):
help = "Generate statistics on realm activity."
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
help="realm to generate statistics for")
def active_users(self, realm: Realm) -> List[UserProfile]:
def active_users(self, realm):
# type: (Realm) -> List[UserProfile]
# Has been active (on the website, for now) in the last 7 days.
activity_cutoff = timezone_now() - datetime.timedelta(days=7)
return [activity.user_profile for activity in (
@@ -32,44 +35,53 @@ class Command(BaseCommand):
query="/json/users/me/pointer",
client__name="website"))]
def messages_sent_by(self, user: UserProfile, days_ago: int) -> int:
def messages_sent_by(self, user, days_ago):
# type: (UserProfile, int) -> int
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender=user, pub_date__gt=sent_time_cutoff).count()
def total_messages(self, realm: Realm, days_ago: int) -> int:
def total_messages(self, realm, days_ago):
# type: (Realm, int) -> int
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return Message.objects.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()
def human_messages(self, realm: Realm, days_ago: int) -> int:
def human_messages(self, realm, days_ago):
# type: (Realm, int) -> int
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()
def api_messages(self, realm: Realm, days_ago: int) -> int:
def api_messages(self, realm, days_ago):
# type: (Realm, int) -> int
return (self.total_messages(realm, days_ago) - self.human_messages(realm, days_ago))
def stream_messages(self, realm: Realm, days_ago: int) -> int:
def stream_messages(self, realm, days_ago):
# type: (Realm, int) -> int
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff,
recipient__type=Recipient.STREAM).count()
def private_messages(self, realm: Realm, days_ago: int) -> int:
def private_messages(self, realm, days_ago):
# type: (Realm, int) -> int
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.HUDDLE).count()
def group_private_messages(self, realm: Realm, days_ago: int) -> int:
def group_private_messages(self, realm, days_ago):
# type: (Realm, int) -> int
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.PERSONAL).count()
def report_percentage(self, numerator: float, denominator: float, text: str) -> None:
def report_percentage(self, numerator, denominator, text):
# type: (float, float, str) -> None
if not denominator:
fraction = 0.0
else:
fraction = numerator / float(denominator)
print("%.2f%% of" % (fraction * 100,), text)
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
if options['realms']:
try:
realms = [get_realm(string_id) for string_id in options['realms']]

@@ -1,20 +1,20 @@
from argparse import ArgumentParser
from typing import Any
from argparse import ArgumentParser
from django.core.management.base import BaseCommand
from django.db.models import Q
from zerver.models import Message, Realm, \
Recipient, Stream, Subscription, get_realm
from zerver.models import Realm, Stream, Message, Subscription, Recipient, get_realm
class Command(BaseCommand):
help = "Generate statistics on the streams for a realm."
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
help="realm to generate statistics for")
def handle(self, *args: Any, **options: str) -> None:
def handle(self, *args, **options):
# type: (*Any, **str) -> None
if options['realms']:
try:
realms = [get_realm(string_id) for string_id in options['realms']]
@@ -36,8 +36,7 @@ class Command(BaseCommand):
continue
print("%25s" % (stream.name,), end=' ')
recipient = Recipient.objects.filter(type=Recipient.STREAM, type_id=stream.id)
print("%10d" % (len(Subscription.objects.filter(recipient=recipient,
active=True)),), end=' ')
print("%10d" % (len(Subscription.objects.filter(recipient=recipient, active=True)),), end=' ')
num_messages = len(Message.objects.filter(recipient=recipient))
print("%12d" % (num_messages,))
print("%d invite-only streams" % (invite_only_count,))

@@ -1,29 +1,34 @@
import os
import time
from argparse import ArgumentParser
from typing import Any, Dict
import sys
from scripts.lib.zulip_tools import ENDC, WARNING
from argparse import ArgumentParser
from datetime import timedelta
import time
from django.conf import settings
from django.core.management.base import BaseCommand
from django.utils.dateparse import parse_datetime
from django.utils.timezone import now as timezone_now
from django.utils.timezone import utc as timezone_utc
from django.utils.dateparse import parse_datetime
from django.conf import settings
from analytics.models import RealmCount, UserCount
from analytics.lib.counts import COUNT_STATS, logger, process_count_stat
from scripts.lib.zulip_tools import ENDC, WARNING
from zerver.lib.timestamp import floor_to_hour
from zerver.models import Realm
from zerver.models import UserProfile, Message, Realm
from typing import Any, Dict
class Command(BaseCommand):
help = """Fills Analytics tables.
Run as a cron job that runs every hour."""
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('--time', '-t',
type=str,
help='Update stat tables from current state to'
'--time. Defaults to the current time.',
help='Update stat tables from current state to --time. Defaults to the current time.',
default=timezone_now().isoformat())
parser.add_argument('--utc',
action='store_true',
@@ -37,7 +42,8 @@ class Command(BaseCommand):
help="Print timing information to stdout.",
default=False)
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
try:
os.mkdir(settings.ANALYTICS_LOCK_DIR)
except OSError:
@@ -49,7 +55,8 @@ class Command(BaseCommand):
finally:
os.rmdir(settings.ANALYTICS_LOCK_DIR)
def run_update_analytics_counts(self, options: Dict[str, Any]) -> None:
def run_update_analytics_counts(self, options):
# type: (Dict[str, Any]) -> None
# installation_epoch relies on there being at least one realm; we
# shouldn't run the analytics code if that condition isn't satisfied
if not Realm.objects.exists():

@@ -1,25 +1,29 @@
import datetime
from argparse import ArgumentParser
import datetime
import pytz
from typing import Any
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from zerver.models import Message, Realm, Stream, UserProfile, get_realm
from zerver.models import UserProfile, Realm, Stream, Message, get_realm
class Command(BaseCommand):
help = "Generate statistics on user activity."
def add_arguments(self, parser: ArgumentParser) -> None:
def add_arguments(self, parser):
# type: (ArgumentParser) -> None
parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
help="realm to generate statistics for")
def messages_sent_by(self, user: UserProfile, week: int) -> int:
def messages_sent_by(self, user, week):
# type: (UserProfile, int) -> int
start = timezone_now() - datetime.timedelta(days=(week + 1)*7)
end = timezone_now() - datetime.timedelta(days=week*7)
return Message.objects.filter(sender=user, pub_date__gt=start, pub_date__lte=end).count()
def handle(self, *args: Any, **options: Any) -> None:
def handle(self, *args, **options):
# type: (*Any, **Any) -> None
if options['realms']:
try:
realms = [get_realm(string_id) for string_id in options['realms']]

@@ -1,10 +1,10 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
import zerver.lib.str_utils
class Migration(migrations.Migration):
dependencies = [

@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
import zerver.lib.str_utils
class Migration(migrations.Migration):
dependencies = [

@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

@@ -3,6 +3,7 @@
from django.conf import settings
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [

@@ -2,6 +2,7 @@
# Generated by Django 1.10.5 on 2017-02-01 22:28
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [

@@ -1,9 +1,11 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db import migrations
def delete_messages_sent_to_stream_stat(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
def delete_messages_sent_to_stream_stat(apps, schema_editor):
# type: (StateApps, DatabaseSchemaEditor) -> None
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')
RealmCount = apps.get_model('analytics', 'RealmCount')

@@ -1,9 +1,10 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db import migrations
def clear_message_sent_by_message_type_values(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
def clear_message_sent_by_message_type_values(apps, schema_editor):
# type: (StateApps, DatabaseSchemaEditor) -> None
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')
RealmCount = apps.get_model('analytics', 'RealmCount')

@@ -1,9 +1,11 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db import migrations
def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
def clear_analytics_tables(apps, schema_editor):
# type: (StateApps, DatabaseSchemaEditor) -> None
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')
RealmCount = apps.get_model('analytics', 'RealmCount')

@@ -1,36 +0,0 @@
-# -*- coding: utf-8 -*-
-# Generated by Django 1.11.6 on 2018-01-29 08:14
-from __future__ import unicode_literals
-from django.db import migrations, models
-import django.db.models.deletion
-class Migration(migrations.Migration):
-    dependencies = [
-        ('analytics', '0011_clear_analytics_tables'),
-    ]
-    operations = [
-        migrations.AlterField(
-            model_name='installationcount',
-            name='anomaly',
-            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
-        ),
-        migrations.AlterField(
-            model_name='realmcount',
-            name='anomaly',
-            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
-        ),
-        migrations.AlterField(
-            model_name='streamcount',
-            name='anomaly',
-            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
-        ),
-        migrations.AlterField(
-            model_name='usercount',
-            name='anomaly',
-            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
-        ),
-    ]


@@ -1,12 +1,14 @@
-import datetime
-from typing import Any, Dict, Optional, Text, Tuple, Union
 from django.db import models
+from zerver.models import Realm, UserProfile, Stream, Recipient
+from zerver.lib.str_utils import ModelReprMixin
 from zerver.lib.timestamp import floor_to_day
-from zerver.models import Realm, Recipient, Stream, UserProfile
-class FillState(models.Model):
+import datetime
+from typing import Optional, Tuple, Union, Dict, Any, Text
+class FillState(ModelReprMixin, models.Model):
     property = models.CharField(max_length=40, unique=True)  # type: Text
     end_time = models.DateTimeField()  # type: datetime.datetime
@@ -17,16 +19,19 @@ class FillState(models.Model):
     last_modified = models.DateTimeField(auto_now=True)  # type: datetime.datetime
-    def __str__(self) -> Text:
-        return "<FillState: %s %s %s>" % (self.property, self.end_time, self.state)
+    def __unicode__(self):
+        # type: () -> Text
+        return u"<FillState: %s %s %s>" % (self.property, self.end_time, self.state)
 # The earliest/starting end_time in FillState
 # We assume there is at least one realm
-def installation_epoch() -> datetime.datetime:
+def installation_epoch():
+    # type: () -> datetime.datetime
     earliest_realm_creation = Realm.objects.aggregate(models.Min('date_created'))['date_created__min']
     return floor_to_day(earliest_realm_creation)
-def last_successful_fill(property: str) -> Optional[datetime.datetime]:
+def last_successful_fill(property):
+    # type: (str) -> Optional[datetime.datetime]
     fillstate = FillState.objects.filter(property=property).first()
     if fillstate is None:
         return None
@@ -35,13 +40,14 @@ def last_successful_fill(property: str) -> Optional[datetime.datetime]:
     return fillstate.end_time - datetime.timedelta(hours=1)
 # would only ever make entries here by hand
-class Anomaly(models.Model):
+class Anomaly(ModelReprMixin, models.Model):
     info = models.CharField(max_length=1000)  # type: Text
-    def __str__(self) -> Text:
-        return "<Anomaly: %s... %s>" % (self.info, self.id)
+    def __unicode__(self):
+        # type: () -> Text
+        return u"<Anomaly: %s... %s>" % (self.info, self.id)
-class BaseCount(models.Model):
+class BaseCount(ModelReprMixin, models.Model):
     # Note: When inheriting from BaseCount, you may want to rearrange
     # the order of the columns in the migration to make sure they
     # match how you'd like the table to be arranged.
@@ -49,52 +55,55 @@ class BaseCount(models.Model):
     subgroup = models.CharField(max_length=16, null=True)  # type: Optional[Text]
     end_time = models.DateTimeField()  # type: datetime.datetime
     value = models.BigIntegerField()  # type: int
-    anomaly = models.ForeignKey(Anomaly, on_delete=models.SET_NULL, null=True)  # type: Optional[Anomaly]
+    anomaly = models.ForeignKey(Anomaly, null=True)  # type: Optional[Anomaly]
-    class Meta:
+    class Meta(object):
         abstract = True
 class InstallationCount(BaseCount):
-    class Meta:
+    class Meta(object):
         unique_together = ("property", "subgroup", "end_time")
-    def __str__(self) -> Text:
-        return "<InstallationCount: %s %s %s>" % (self.property, self.subgroup, self.value)
+    def __unicode__(self):
+        # type: () -> Text
+        return u"<InstallationCount: %s %s %s>" % (self.property, self.subgroup, self.value)
 class RealmCount(BaseCount):
-    realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
+    realm = models.ForeignKey(Realm)
-    class Meta:
+    class Meta(object):
         unique_together = ("realm", "property", "subgroup", "end_time")
         index_together = ["property", "end_time"]
-    def __str__(self) -> Text:
-        return "<RealmCount: %s %s %s %s>" % (self.realm, self.property, self.subgroup, self.value)
+    def __unicode__(self):
+        # type: () -> Text
+        return u"<RealmCount: %s %s %s %s>" % (self.realm, self.property, self.subgroup, self.value)
 class UserCount(BaseCount):
-    user = models.ForeignKey(UserProfile, on_delete=models.CASCADE)
-    realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
+    user = models.ForeignKey(UserProfile)
+    realm = models.ForeignKey(Realm)
-    class Meta:
+    class Meta(object):
         unique_together = ("user", "property", "subgroup", "end_time")
         # This index dramatically improves the performance of
         # aggregating from users to realms
         index_together = ["property", "realm", "end_time"]
-    def __str__(self) -> Text:
-        return "<UserCount: %s %s %s %s>" % (self.user, self.property, self.subgroup, self.value)
+    def __unicode__(self):
+        # type: () -> Text
+        return u"<UserCount: %s %s %s %s>" % (self.user, self.property, self.subgroup, self.value)
 class StreamCount(BaseCount):
-    stream = models.ForeignKey(Stream, on_delete=models.CASCADE)
-    realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
+    stream = models.ForeignKey(Stream)
+    realm = models.ForeignKey(Realm)
-    class Meta:
+    class Meta(object):
         unique_together = ("stream", "property", "subgroup", "end_time")
         # This index dramatically improves the performance of
         # aggregating from streams to realms
         index_together = ["property", "realm", "end_time"]
-    def __str__(self) -> Text:
-        return "<StreamCount: %s %s %s %s %s>" % (
-            self.stream, self.property, self.subgroup, self.value, self.id)
+    def __unicode__(self):
+        # type: () -> Text
+        return u"<StreamCount: %s %s %s %s %s>" % (self.stream, self.property, self.subgroup, self.value, self.id)
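Two themes run through this models.py hunk. First, the explicit `on_delete` arguments disappear on the 1.7.1 side because Django before 2.0 made `on_delete=CASCADE` the implicit default for `ForeignKey`. Second, 1.7.1 mixes in `zerver.lib.str_utils.ModelReprMixin` and defines `__unicode__` instead of `__str__`, the Python-2-era Django convention. A rough pure-Python sketch of what such a mixin does (an illustration with hypothetical names, not Zulip's actual implementation):

```python
class ReprMixin(object):
    # Derive __str__ and __repr__ from __unicode__, the method that
    # Python 2 Django models conventionally defined.
    def __str__(self):
        return self.__unicode__()

    def __repr__(self):
        return self.__str__()

class FillStateLike(ReprMixin):
    def __init__(self, prop, state):
        self.prop = prop
        self.state = state

    def __unicode__(self):
        return u"<FillState: %s %s>" % (self.prop, self.state)
```

With native Python 3 strings, `__str__` alone suffices, which is why 1.8.0 drops the mixin entirely.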


@@ -1,8 +1,4 @@
-from datetime import datetime, timedelta
-from typing import Any, Dict, List, Optional, Text, Tuple, Type, Union
-import ujson
 from django.apps import apps
 from django.db import models
 from django.db.models import Sum
@@ -10,22 +6,23 @@ from django.test import TestCase
 from django.utils.timezone import now as timezone_now
 from django.utils.timezone import utc as timezone_utc
-from analytics.lib.counts import COUNT_STATS, CountStat, DataCollector, \
-    DependentCountStat, LoggingCountStat, do_aggregate_to_summary_table, \
-    do_drop_all_analytics_tables, do_drop_single_stat, \
-    do_fill_count_stat_at_hour, do_increment_logging_stat, \
-    process_count_stat, sql_data_collector
-from analytics.models import Anomaly, BaseCount, \
-    FillState, InstallationCount, RealmCount, StreamCount, \
-    UserCount, installation_epoch, last_successful_fill
-from zerver.lib.actions import do_activate_user, do_create_user, \
-    do_deactivate_user, do_reactivate_user, update_user_activity_interval, \
-    do_invite_users, do_revoke_user_invite, do_resend_user_invite_email, \
-    InvitationError
-from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day
-from zerver.models import Client, Huddle, Message, Realm, \
-    RealmAuditLog, Recipient, Stream, UserActivityInterval, \
-    UserProfile, get_client, get_user, PreregistrationUser
+from analytics.lib.counts import CountStat, COUNT_STATS, process_count_stat, \
+    do_fill_count_stat_at_hour, do_increment_logging_stat, DataCollector, \
+    sql_data_collector, LoggingCountStat, do_aggregate_to_summary_table, \
+    do_drop_all_analytics_tables, do_drop_single_stat, DependentCountStat
+from analytics.models import BaseCount, InstallationCount, RealmCount, \
+    UserCount, StreamCount, FillState, Anomaly, installation_epoch, \
+    last_successful_fill
+from zerver.lib.actions import do_create_user, do_deactivate_user, \
+    do_activate_user, do_reactivate_user, update_user_activity_interval
+from zerver.lib.timestamp import floor_to_day, TimezoneNotUTCException
+from zerver.models import Realm, UserProfile, Message, Stream, Recipient, \
+    Huddle, Client, UserActivityInterval, RealmAuditLog, get_client, get_user
+from datetime import datetime, timedelta
+import ujson
+from typing import Any, Dict, List, Optional, Text, Tuple, Type, Union
class AnalyticsTestCase(TestCase):
MINUTE = timedelta(seconds = 60)
@@ -34,7 +31,8 @@ class AnalyticsTestCase(TestCase):
TIME_ZERO = datetime(1988, 3, 14).replace(tzinfo=timezone_utc)
TIME_LAST_HOUR = TIME_ZERO - HOUR
-    def setUp(self) -> None:
+    def setUp(self):
+        # type: () -> None
self.default_realm = Realm.objects.create(
string_id='realmtest', name='Realm Test', date_created=self.TIME_ZERO - 2*self.DAY)
# used to generate unique names in self.create_*
@@ -43,7 +41,8 @@ class AnalyticsTestCase(TestCase):
self.current_property = None # type: Optional[str]
# Lightweight creation of users, streams, and messages
-    def create_user(self, **kwargs: Any) -> UserProfile:
+    def create_user(self, **kwargs):
+        # type: (**Any) -> UserProfile
self.name_counter += 1
defaults = {
'email': 'user%s@domain.tld' % (self.name_counter,),
@@ -58,7 +57,8 @@ class AnalyticsTestCase(TestCase):
kwargs[key] = kwargs.get(key, value)
return UserProfile.objects.create(**kwargs)
-    def create_stream_with_recipient(self, **kwargs: Any) -> Tuple[Stream, Recipient]:
+    def create_stream_with_recipient(self, **kwargs):
+        # type: (**Any) -> Tuple[Stream, Recipient]
self.name_counter += 1
defaults = {'name': 'stream name %s' % (self.name_counter,),
'realm': self.default_realm,
@@ -69,7 +69,8 @@ class AnalyticsTestCase(TestCase):
recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
return stream, recipient
-    def create_huddle_with_recipient(self, **kwargs: Any) -> Tuple[Huddle, Recipient]:
+    def create_huddle_with_recipient(self, **kwargs):
+        # type: (**Any) -> Tuple[Huddle, Recipient]
self.name_counter += 1
defaults = {'huddle_hash': 'hash%s' % (self.name_counter,)}
for key, value in defaults.items():
@@ -78,7 +79,8 @@ class AnalyticsTestCase(TestCase):
recipient = Recipient.objects.create(type_id=huddle.id, type=Recipient.HUDDLE)
return huddle, recipient
-    def create_message(self, sender: UserProfile, recipient: Recipient, **kwargs: Any) -> Message:
+    def create_message(self, sender, recipient, **kwargs):
+        # type: (UserProfile, Recipient, **Any) -> Message
defaults = {
'sender': sender,
'recipient': recipient,
@@ -91,9 +93,9 @@ class AnalyticsTestCase(TestCase):
return Message.objects.create(**kwargs)
# kwargs should only ever be a UserProfile or Stream.
-    def assertCountEquals(self, table: Type[BaseCount], value: int, property: Optional[Text]=None,
-                          subgroup: Optional[Text]=None, end_time: datetime=TIME_ZERO,
-                          realm: Optional[Realm]=None, **kwargs: models.Model) -> None:
+    def assertCountEquals(self, table, value, property=None, subgroup=None,
+                          end_time=TIME_ZERO, realm=None, **kwargs):
+        # type: (Type[BaseCount], int, Optional[Text], Optional[Text], datetime, Optional[Realm], **models.Model) -> None
if property is None:
property = self.current_property
queryset = table.objects.filter(property=property, end_time=end_time).filter(**kwargs)
@@ -105,8 +107,8 @@ class AnalyticsTestCase(TestCase):
queryset = queryset.filter(subgroup=subgroup)
self.assertEqual(queryset.values_list('value', flat=True)[0], value)
-    def assertTableState(self, table: Type[BaseCount], arg_keys: List[str],
-                         arg_values: List[List[object]]) -> None:
+    def assertTableState(self, table, arg_keys, arg_values):
+        # type: (Type[BaseCount], List[str], List[List[Union[int, str, bool, datetime, Realm, UserProfile, Stream]]]) -> None
"""Assert that the state of a *Count table is what it should be.
Example usage:
@@ -151,18 +153,20 @@ class AnalyticsTestCase(TestCase):
self.assertEqual(table.objects.count(), len(arg_values))
class TestProcessCountStat(AnalyticsTestCase):
-    def make_dummy_count_stat(self, property: str) -> CountStat:
+    def make_dummy_count_stat(self, property):
+        # type: (str) -> CountStat
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, property)
return CountStat(property, sql_data_collector(RealmCount, query, None), CountStat.HOUR)
-    def assertFillStateEquals(self, stat: CountStat, end_time: datetime,
-                              state: int=FillState.DONE) -> None:
+    def assertFillStateEquals(self, stat, end_time, state=FillState.DONE):
+        # type: (CountStat, datetime, int) -> None
fill_state = FillState.objects.filter(property=stat.property).first()
self.assertEqual(fill_state.end_time, end_time)
self.assertEqual(fill_state.state, state)
-    def test_process_stat(self) -> None:
+    def test_process_stat(self):
+        # type: () -> None
# process new stat
current_time = installation_epoch() + self.HOUR
stat = self.make_dummy_count_stat('test stat')
@@ -188,7 +192,8 @@ class TestProcessCountStat(AnalyticsTestCase):
self.assertFillStateEquals(stat, current_time)
self.assertEqual(InstallationCount.objects.filter(property=stat.property).count(), 2)
-    def test_bad_fill_to_time(self) -> None:
+    def test_bad_fill_to_time(self):
+        # type: () -> None
stat = self.make_dummy_count_stat('test stat')
with self.assertRaises(ValueError):
process_count_stat(stat, installation_epoch() + 65*self.MINUTE)
@@ -198,7 +203,8 @@ class TestProcessCountStat(AnalyticsTestCase):
# This tests the LoggingCountStat branch of the code in do_delete_counts_at_hour.
# It is important that do_delete_counts_at_hour not delete any of the collected
# logging data!
-    def test_process_logging_stat(self) -> None:
+    def test_process_logging_stat(self):
+        # type: () -> None
end_time = self.TIME_ZERO
user_stat = LoggingCountStat('user stat', UserCount, CountStat.DAY)
@@ -220,13 +226,9 @@ class TestProcessCountStat(AnalyticsTestCase):
self.assertTableState(UserCount, ['property', 'value'], [[user_stat.property, 5]])
self.assertTableState(StreamCount, ['property', 'value'], [[stream_stat.property, 5]])
         self.assertTableState(RealmCount, ['property', 'value'],
-                              [[user_stat.property, 5],
-                               [stream_stat.property, 5],
-                               [realm_stat.property, 5]])
+                              [[user_stat.property, 5], [stream_stat.property, 5], [realm_stat.property, 5]])
         self.assertTableState(InstallationCount, ['property', 'value'],
-                              [[user_stat.property, 5],
-                               [stream_stat.property, 5],
-                               [realm_stat.property, 5]])
+                              [[user_stat.property, 5], [stream_stat.property, 5], [realm_stat.property, 5]])
# Change the logged data and mark FillState as dirty
UserCount.objects.update(value=6)
@@ -240,21 +242,17 @@ class TestProcessCountStat(AnalyticsTestCase):
self.assertTableState(UserCount, ['property', 'value'], [[user_stat.property, 6]])
self.assertTableState(StreamCount, ['property', 'value'], [[stream_stat.property, 6]])
         self.assertTableState(RealmCount, ['property', 'value'],
-                              [[user_stat.property, 6],
-                               [stream_stat.property, 6],
-                               [realm_stat.property, 6]])
+                              [[user_stat.property, 6], [stream_stat.property, 6], [realm_stat.property, 6]])
         self.assertTableState(InstallationCount, ['property', 'value'],
-                              [[user_stat.property, 6],
-                               [stream_stat.property, 6],
-                               [realm_stat.property, 6]])
+                              [[user_stat.property, 6], [stream_stat.property, 6], [realm_stat.property, 6]])
-    def test_process_dependent_stat(self) -> None:
+    def test_process_dependent_stat(self):
+        # type: () -> None
stat1 = self.make_dummy_count_stat('stat1')
stat2 = self.make_dummy_count_stat('stat2')
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat3')
-        stat3 = DependentCountStat('stat3', sql_data_collector(RealmCount, query, None),
-                                   CountStat.HOUR,
+        stat3 = DependentCountStat('stat3', sql_data_collector(RealmCount, query, None), CountStat.HOUR,
dependencies=['stat1', 'stat2'])
hour = [installation_epoch() + i*self.HOUR for i in range(5)]
@@ -287,8 +285,7 @@ class TestProcessCountStat(AnalyticsTestCase):
# test daily dependent stat with hourly dependencies
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat4')
-        stat4 = DependentCountStat('stat4', sql_data_collector(RealmCount, query, None),
-                                   CountStat.DAY,
+        stat4 = DependentCountStat('stat4', sql_data_collector(RealmCount, query, None), CountStat.DAY,
dependencies=['stat1', 'stat2'])
hour24 = installation_epoch() + 24*self.HOUR
hour25 = installation_epoch() + 25*self.HOUR
@@ -299,8 +296,9 @@ class TestProcessCountStat(AnalyticsTestCase):
self.assertFillStateEquals(stat4, hour24)
class TestCountStats(AnalyticsTestCase):
-    def setUp(self) -> None:
-        super().setUp()
+    def setUp(self):
+        # type: () -> None
+        super(TestCountStats, self).setUp()
# This tests two things for each of the queries/CountStats: Handling
# more than 1 realm, and the time bounds (time_start and time_end in
# the queries).
@@ -328,7 +326,8 @@ class TestCountStats(AnalyticsTestCase):
# This huddle should not show up anywhere
self.create_huddle_with_recipient()
-    def test_active_users_by_is_bot(self) -> None:
+    def test_active_users_by_is_bot(self):
+        # type: () -> None
stat = COUNT_STATS['active_users:is_bot:day']
self.current_property = stat.property
@@ -346,21 +345,19 @@ class TestCountStats(AnalyticsTestCase):
[[2, 'true'], [1, 'false'],
[3, 'false', self.second_realm],
[1, 'false', self.no_message_realm]])
-        self.assertTableState(InstallationCount,
-                              ['value', 'subgroup'],
-                              [[2, 'true'], [5, 'false']])
+        self.assertTableState(InstallationCount, ['value', 'subgroup'], [[2, 'true'], [5, 'false']])
self.assertTableState(UserCount, [], [])
self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_by_is_bot(self) -> None:
+    def test_messages_sent_by_is_bot(self):
+        # type: () -> None
stat = COUNT_STATS['messages_sent:is_bot:hour']
self.current_property = stat.property
bot = self.create_user(is_bot=True)
human1 = self.create_user()
human2 = self.create_user()
-        recipient_human1 = Recipient.objects.create(type_id=human1.id,
-                                                    type=Recipient.PERSONAL)
+        recipient_human1 = Recipient.objects.create(type_id=human1.id, type=Recipient.PERSONAL)
recipient_stream = self.create_stream_with_recipient()[1]
recipient_huddle = self.create_huddle_with_recipient()[1]
@@ -381,7 +378,8 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value', 'subgroup'], [[3, 'false'], [3, 'true']])
self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_by_message_type(self) -> None:
+    def test_messages_sent_by_message_type(self):
+        # type: () -> None
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
@@ -443,7 +441,8 @@ class TestCountStats(AnalyticsTestCase):
[2, 'huddle_message']])
self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_to_recipients_with_same_id(self) -> None:
+    def test_messages_sent_to_recipients_with_same_id(self):
+        # type: () -> None
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
@@ -462,7 +461,8 @@ class TestCountStats(AnalyticsTestCase):
self.assertCountEquals(UserCount, 1, subgroup='huddle_message')
self.assertCountEquals(UserCount, 1, subgroup='public_stream')
-    def test_messages_sent_by_client(self) -> None:
+    def test_messages_sent_by_client(self):
+        # type: () -> None
stat = COUNT_STATS['messages_sent:client:day']
self.current_property = stat.property
@@ -497,7 +497,8 @@ class TestCountStats(AnalyticsTestCase):
[[4, website_client_id], [3, client2_id]])
self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_to_stream_by_is_bot(self) -> None:
+    def test_messages_sent_to_stream_by_is_bot(self):
+        # type: () -> None
stat = COUNT_STATS['messages_in_stream:is_bot:day']
self.current_property = stat.property
@@ -534,13 +535,14 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value', 'subgroup'], [[5, 'false'], [2, 'true']])
self.assertTableState(UserCount, [], [])
-    def create_interval(self, user: UserProfile, start_offset: timedelta,
-                        end_offset: timedelta) -> None:
+    def create_interval(self, user, start_offset, end_offset):
+        # type: (UserProfile, timedelta, timedelta) -> None
UserActivityInterval.objects.create(
user_profile=user, start=self.TIME_ZERO-start_offset,
end=self.TIME_ZERO-end_offset)
-    def test_15day_actives(self) -> None:
+    def test_15day_actives(self):
+        # type: () -> None
stat = COUNT_STATS['15day_actives::day']
self.current_property = stat.property
@@ -583,7 +585,8 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[6]])
self.assertTableState(StreamCount, [], [])
-    def test_minutes_active(self) -> None:
+    def test_minutes_active(self):
+        # type: () -> None
stat = COUNT_STATS['minutes_active::day']
self.current_property = stat.property
@@ -631,14 +634,16 @@ class TestDoAggregateToSummaryTable(AnalyticsTestCase):
# feature important for keeping the size of the analytics tables small,
# which is that if there is no relevant data in the table being
# aggregated, the aggregation table doesn't get a row with value 0.
-    def test_no_aggregated_zeros(self) -> None:
+    def test_no_aggregated_zeros(self):
+        # type: () -> None
stat = LoggingCountStat('test stat', UserCount, CountStat.HOUR)
do_aggregate_to_summary_table(stat, self.TIME_ZERO)
self.assertFalse(RealmCount.objects.exists())
self.assertFalse(InstallationCount.objects.exists())
class TestDoIncrementLoggingStat(AnalyticsTestCase):
-    def test_table_and_id_args(self) -> None:
+    def test_table_and_id_args(self):
+        # type: () -> None
# For realms, streams, and users, tests that the new rows are going to
# the appropriate *Count table, and that using a different zerver_object
# results in a new row being created
@@ -663,7 +668,8 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
do_increment_logging_stat(stream2, stat, None, self.TIME_ZERO)
self.assertTableState(StreamCount, ['stream'], [[stream1], [stream2]])
-    def test_frequency(self) -> None:
+    def test_frequency(self):
+        # type: () -> None
times = [self.TIME_ZERO - self.MINUTE*i for i in [0, 1, 61, 24*60+1]]
stat = LoggingCountStat('day test', RealmCount, CountStat.DAY)
@@ -680,7 +686,8 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
[1, 'hour test', self.TIME_LAST_HOUR],
[1, 'hour test', self.TIME_ZERO - self.DAY]])
-    def test_get_or_create(self) -> None:
+    def test_get_or_create(self):
+        # type: () -> None
stat = LoggingCountStat('test', RealmCount, CountStat.HOUR)
# All these should trigger the create part of get_or_create.
# property is tested in test_frequency, and id_args are tested in test_id_args,
@@ -698,7 +705,8 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
[[2, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
[1, 'subgroup1', self.TIME_LAST_HOUR]])
-    def test_increment(self) -> None:
+    def test_increment(self):
+        # type: () -> None
stat = LoggingCountStat('test', RealmCount, CountStat.DAY)
self.current_property = 'test'
do_increment_logging_stat(self.default_realm, stat, None, self.TIME_ZERO, increment=-1)
@@ -709,7 +717,8 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
self.assertTableState(RealmCount, ['value'], [[3]])
class TestLoggingCountStats(AnalyticsTestCase):
-    def test_aggregation(self) -> None:
+    def test_aggregation(self):
+        # type: () -> None
stat = LoggingCountStat('realm test', RealmCount, CountStat.DAY)
do_increment_logging_stat(self.default_realm, stat, None, self.TIME_ZERO)
process_count_stat(stat, self.TIME_ZERO)
@@ -731,7 +740,8 @@ class TestLoggingCountStats(AnalyticsTestCase):
self.assertTableState(UserCount, ['property', 'value'], [['user test', 1]])
self.assertTableState(StreamCount, ['property', 'value'], [['stream test', 1]])
-    def test_active_users_log_by_is_bot(self) -> None:
+    def test_active_users_log_by_is_bot(self):
+        # type: () -> None
property = 'active_users_log:is_bot:day'
user = do_create_user('email', 'password', self.default_realm, 'full_name', 'short_name')
self.assertEqual(1, RealmCount.objects.filter(property=property, subgroup=False)
@@ -749,47 +759,9 @@ class TestLoggingCountStats(AnalyticsTestCase):
self.assertEqual(1, RealmCount.objects.filter(property=property, subgroup=False)
.aggregate(Sum('value'))['value__sum'])
-    def test_invites_sent(self) -> None:
-        property = 'invites_sent::day'
-
-        def assertInviteCountEquals(count: int) -> None:
-            self.assertEqual(count, RealmCount.objects.filter(property=property, subgroup=None)
-                             .aggregate(Sum('value'))['value__sum'])
-
-        user = self.create_user(email='first@domain.tld')
-        stream, _ = self.create_stream_with_recipient()
-        do_invite_users(user, ['user1@domain.tld', 'user2@domain.tld'], [stream])
-        assertInviteCountEquals(2)
-
-        # We currently send emails when re-inviting users that haven't
-        # turned into accounts, so count them towards the total
-        do_invite_users(user, ['user1@domain.tld', 'user2@domain.tld'], [stream])
-        assertInviteCountEquals(4)
-
-        # Test mix of good and malformed invite emails
-        try:
-            do_invite_users(user, ['user3@domain.tld', 'malformed'], [stream])
-        except InvitationError:
-            pass
-        assertInviteCountEquals(4)
-
-        # Test inviting existing users
-        try:
-            do_invite_users(user, ['first@domain.tld', 'user4@domain.tld'], [stream])
-        except InvitationError:
-            pass
-        assertInviteCountEquals(5)
-
-        # Revoking invite should not give you credit
-        do_revoke_user_invite(PreregistrationUser.objects.filter(realm=user.realm).first())
-        assertInviteCountEquals(5)
-
-        # Resending invite should cost you
-        do_resend_user_invite_email(PreregistrationUser.objects.first())
-        assertInviteCountEquals(6)
class TestDeleteStats(AnalyticsTestCase):
-    def test_do_drop_all_analytics_tables(self) -> None:
+    def test_do_drop_all_analytics_tables(self):
+        # type: () -> None
user = self.create_user()
stream = self.create_stream_with_recipient()[0]
count_args = {'property': 'test', 'end_time': self.TIME_ZERO, 'value': 10}
@@ -809,7 +781,8 @@ class TestDeleteStats(AnalyticsTestCase):
for table in list(analytics.models.values()):
self.assertFalse(table.objects.exists())
-    def test_do_drop_single_stat(self) -> None:
+    def test_do_drop_single_stat(self):
+        # type: () -> None
user = self.create_user()
stream = self.create_stream_with_recipient()[0]
count_args_to_delete = {'property': 'to_delete', 'end_time': self.TIME_ZERO, 'value': 10}
@@ -837,14 +810,15 @@ class TestDeleteStats(AnalyticsTestCase):
self.assertTrue(table.objects.filter(property='to_save').exists())
class TestActiveUsersAudit(AnalyticsTestCase):
-    def setUp(self) -> None:
-        super().setUp()
+    def setUp(self):
+        # type: () -> None
+        super(TestActiveUsersAudit, self).setUp()
self.user = self.create_user()
self.stat = COUNT_STATS['active_users_audit:is_bot:day']
self.current_property = self.stat.property
-    def add_event(self, event_type: str, days_offset: float,
-                  user: Optional[UserProfile]=None) -> None:
+    def add_event(self, event_type, days_offset, user=None):
+        # type: (str, float, Optional[UserProfile]) -> None
hours_offset = int(24*days_offset)
if user is None:
user = self.user
@@ -852,25 +826,29 @@ class TestActiveUsersAudit(AnalyticsTestCase):
realm=user.realm, modified_user=user, event_type=event_type,
event_time=self.TIME_ZERO - hours_offset*self.HOUR)
-    def test_user_deactivated_in_future(self) -> None:
+    def test_user_deactivated_in_future(self):
+        # type: () -> None
self.add_event('user_created', 1)
self.add_event('user_deactivated', 0)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup'], [['false']])
-    def test_user_reactivated_in_future(self) -> None:
+    def test_user_reactivated_in_future(self):
+        # type: () -> None
self.add_event('user_deactivated', 1)
self.add_event('user_reactivated', 0)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, [], [])
-    def test_user_active_then_deactivated_same_day(self) -> None:
+    def test_user_active_then_deactivated_same_day(self):
+        # type: () -> None
self.add_event('user_created', 1)
self.add_event('user_deactivated', .5)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, [], [])
-    def test_user_unactive_then_activated_same_day(self) -> None:
+    def test_user_unactive_then_activated_same_day(self):
+        # type: () -> None
self.add_event('user_deactivated', 1)
self.add_event('user_reactivated', .5)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
@@ -878,20 +856,23 @@ class TestActiveUsersAudit(AnalyticsTestCase):
# Arguably these next two tests are duplicates of the _in_future tests, but are
# a guard against future refactorings where they may no longer be duplicates
-    def test_user_active_then_deactivated_with_day_gap(self) -> None:
+    def test_user_active_then_deactivated_with_day_gap(self):
+        # type: () -> None
self.add_event('user_created', 2)
self.add_event('user_deactivated', 1)
process_count_stat(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup', 'end_time'],
[['false', self.TIME_ZERO - self.DAY]])
-    def test_user_deactivated_then_reactivated_with_day_gap(self) -> None:
+    def test_user_deactivated_then_reactivated_with_day_gap(self):
+        # type: () -> None
self.add_event('user_deactivated', 2)
self.add_event('user_reactivated', 1)
process_count_stat(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup'], [['false']])
-    def test_event_types(self) -> None:
+    def test_event_types(self):
+        # type: () -> None
self.add_event('user_created', 4)
self.add_event('user_deactivated', 3)
self.add_event('user_activated', 2)
@@ -903,7 +884,8 @@ class TestActiveUsersAudit(AnalyticsTestCase):
# Also tests that aggregation to RealmCount and InstallationCount is
# being done, and that we're storing the user correctly in UserCount
-    def test_multiple_users_realms_and_bots(self) -> None:
+    def test_multiple_users_realms_and_bots(self):
+        # type: () -> None
user1 = self.create_user()
user2 = self.create_user()
second_realm = Realm.objects.create(string_id='moo', name='moo')
@@ -927,7 +909,8 @@ class TestActiveUsersAudit(AnalyticsTestCase):
# do_fill_count_stat_at_hour. E.g. if one changes self.stat.frequency to
# CountStat.HOUR from CountStat.DAY, this will fail, while many of the
# tests above will not.
-    def test_update_from_two_days_ago(self) -> None:
+    def test_update_from_two_days_ago(self):
+        # type: () -> None
self.add_event('user_created', 2)
process_count_stat(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup', 'end_time'],
@@ -936,27 +919,31 @@ class TestActiveUsersAudit(AnalyticsTestCase):
# User with no relevant activity could happen e.g. for a system bot that
# doesn't go through do_create_user. Mainly just want to make sure that
# that situation doesn't throw an error.
-    def test_empty_realm_or_user_with_no_relevant_activity(self) -> None:
+    def test_empty_realm_or_user_with_no_relevant_activity(self):
+        # type: () -> None
self.add_event('unrelated', 1)
self.create_user() # also test a user with no RealmAuditLog entries
Realm.objects.create(string_id='moo', name='moo')
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, [], [])
-    def test_max_audit_entry_is_unrelated(self) -> None:
+    def test_max_audit_entry_is_unrelated(self):
+        # type: () -> None
self.add_event('user_created', 1)
self.add_event('unrelated', .5)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup'], [['false']])
# Simultaneous related audit entries should not be allowed, and so not testing for that.
-    def test_simultaneous_unrelated_audit_entry(self) -> None:
+    def test_simultaneous_unrelated_audit_entry(self):
+        # type: () -> None
self.add_event('user_created', 1)
self.add_event('unrelated', 1)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup'], [['false']])
-    def test_simultaneous_max_audit_entries_of_different_users(self) -> None:
+    def test_simultaneous_max_audit_entries_of_different_users(self):
+        # type: () -> None
user1 = self.create_user()
user2 = self.create_user()
user3 = self.create_user()
@@ -968,7 +955,8 @@ class TestActiveUsersAudit(AnalyticsTestCase):
self.assertTableState(UserCount, ['user', 'subgroup'],
[[user1, 'false'], [user2, 'false']])
-    def test_end_to_end_with_actions_dot_py(self) -> None:
+    def test_end_to_end_with_actions_dot_py(self):
+        # type: () -> None
user1 = do_create_user('email1', 'password', self.default_realm, 'full_name', 'short_name')
user2 = do_create_user('email2', 'password', self.default_realm, 'full_name', 'short_name')
user3 = do_create_user('email3', 'password', self.default_realm, 'full_name', 'short_name')
@@ -985,26 +973,30 @@ class TestActiveUsersAudit(AnalyticsTestCase):
self.assertFalse(UserCount.objects.filter(user=user2).exists())
class TestRealmActiveHumans(AnalyticsTestCase):
def setUp(self) -> None:
super().setUp()
def setUp(self):
# type: () -> None
super(TestRealmActiveHumans, self).setUp()
self.stat = COUNT_STATS['realm_active_humans::day']
self.current_property = self.stat.property
def mark_audit_active(self, user: UserProfile, end_time: Optional[datetime]=None) -> None:
def mark_audit_active(self, user, end_time=None):
# type: (UserProfile, Optional[datetime]) -> None
if end_time is None:
end_time = self.TIME_ZERO
UserCount.objects.create(
user=user, realm=user.realm, property='active_users_audit:is_bot:day',
subgroup=ujson.dumps(user.is_bot), end_time=end_time, value=1)
def mark_15day_active(self, user: UserProfile, end_time: Optional[datetime]=None) -> None:
def mark_15day_active(self, user, end_time=None):
# type: (UserProfile, Optional[datetime]) -> None
if end_time is None:
end_time = self.TIME_ZERO
UserCount.objects.create(
user=user, realm=user.realm, property='15day_actives::day',
end_time=end_time, value=1)
def test_basic_boolean_logic(self) -> None:
def test_basic_boolean_logic(self):
# type: () -> None
user = self.create_user()
self.mark_audit_active(user, end_time=self.TIME_ZERO - self.DAY)
self.mark_15day_active(user, end_time=self.TIME_ZERO)
@@ -1015,14 +1007,16 @@ class TestRealmActiveHumans(AnalyticsTestCase):
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO + i*self.DAY)
self.assertTableState(RealmCount, ['value', 'end_time'], [[1, self.TIME_ZERO + self.DAY]])
def test_bots_not_counted(self) -> None:
def test_bots_not_counted(self):
# type: () -> None
bot = self.create_user(is_bot=True)
self.mark_audit_active(bot)
self.mark_15day_active(bot)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(RealmCount, [], [])
def test_multiple_users_realms_and_times(self) -> None:
def test_multiple_users_realms_and_times(self):
# type: () -> None
user1 = self.create_user()
user2 = self.create_user()
second_realm = Realm.objects.create(string_id='second', name='second')
@@ -1062,7 +1056,8 @@ class TestRealmActiveHumans(AnalyticsTestCase):
[1, self.default_realm, self.TIME_ZERO - self.DAY],
[2, second_realm, self.TIME_ZERO - self.DAY]])
def test_end_to_end(self) -> None:
def test_end_to_end(self):
# type: () -> None
user1 = do_create_user('email1', 'password', self.default_realm, 'full_name', 'short_name')
user2 = do_create_user('email2', 'password', self.default_realm, 'full_name', 'short_name')
do_create_user('email3', 'password', self.default_realm, 'full_name', 'short_name')
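Nearly every hunk in this comparison is the same mechanical change: inline PEP 484 annotations (apparently the 1.8.0 side) paired against equivalent `# type:` comments (the 1.7.1 side, which still had to parse under Python 2). A minimal sketch of the two forms, using a small helper in the style of the `data` methods in these test diffs:

```python
from typing import List

# Python 3 annotation syntax, as on the 1.8.0 side of this comparison:
def data_annotated(i: int) -> List[int]:
    return [0, 0, i, 0]

# The equivalent comment-style annotation, as on the 1.7.1 side; a type
# checker such as mypy treats both forms identically, but only the
# comment form is valid syntax under Python 2:
def data_commented(i):
    # type: (int) -> List[int]
    return [0, 0, i, 0]
```

At runtime the two functions behave identically; only the checker sees a difference in where the types live.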


@@ -1,10 +1,12 @@
from zerver.lib.test_classes import ZulipTestCase
from analytics.lib.counts import CountStat
from analytics.lib.fixtures import generate_time_series_data
from zerver.lib.test_classes import ZulipTestCase
# A very light test suite; the code being tested is not run in production.
class TestFixtures(ZulipTestCase):
def test_deterministic_settings(self) -> None:
def test_deterministic_settings(self):
# type: () -> None
# test basic business_hour / non_business_hour calculation
# test we get an array of the right length with frequency=CountStat.DAY
data = generate_time_series_data(


@@ -1,22 +1,25 @@
from datetime import datetime, timedelta
from typing import Dict, List, Optional
import mock
from django.utils.timezone import utc
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.lib.time_utils import time_range
from analytics.models import FillState, \
RealmCount, UserCount, last_successful_fill
from analytics.views import get_chart_data, rewrite_client_arrays, \
sort_by_totals, sort_client_labels, stats
from django.utils.timezone import get_fixed_timezone, utc
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, datetime_to_timestamp
from zerver.models import Client, get_realm
from zerver.lib.timestamp import ceiling_to_hour, ceiling_to_day, \
datetime_to_timestamp
from zerver.models import Realm, UserProfile, Client, get_realm
from analytics.lib.counts import CountStat, COUNT_STATS
from analytics.lib.time_utils import time_range
from analytics.models import RealmCount, UserCount, BaseCount, \
FillState, last_successful_fill
from analytics.views import stats, get_chart_data, sort_by_totals, \
sort_client_labels, rewrite_client_arrays
from datetime import datetime, timedelta
import mock
import ujson
from typing import List, Dict, Optional
class TestStatsEndpoint(ZulipTestCase):
def test_stats(self) -> None:
def test_stats(self):
# type: () -> None
self.user = self.example_user('hamlet')
self.login(self.user.email)
result = self.client_get('/stats')
@@ -25,7 +28,8 @@ class TestStatsEndpoint(ZulipTestCase):
self.assert_in_response("Zulip analytics for", result)
class TestGetChartData(ZulipTestCase):
def setUp(self) -> None:
def setUp(self):
# type: () -> None
self.realm = get_realm('zulip')
self.user = self.example_user('hamlet')
self.login(self.user.email)
@@ -34,11 +38,12 @@ class TestGetChartData(ZulipTestCase):
self.end_times_day = [ceiling_to_day(self.realm.date_created) + timedelta(days=i)
for i in range(4)]
def data(self, i: int) -> List[int]:
def data(self, i):
# type: (int) -> List[int]
return [0, 0, i, 0]
def insert_data(self, stat: CountStat, realm_subgroups: List[Optional[str]],
user_subgroups: List[str]) -> None:
def insert_data(self, stat, realm_subgroups, user_subgroups):
# type: (CountStat, List[Optional[str]], List[str]) -> None
if stat.frequency == CountStat.HOUR:
insert_time = self.end_times_hour[2]
fill_time = self.end_times_hour[-1]
@@ -56,7 +61,8 @@ class TestGetChartData(ZulipTestCase):
for i, subgroup in enumerate(user_subgroups)])
FillState.objects.create(property=stat.property, end_time=fill_time, state=FillState.DONE)
def test_number_of_humans(self) -> None:
def test_number_of_humans(self):
# type: () -> None
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
result = self.client_get('/json/analytics/chart_data',
@@ -72,7 +78,8 @@ class TestGetChartData(ZulipTestCase):
'result': 'success',
})
def test_messages_sent_over_time(self) -> None:
def test_messages_sent_over_time(self):
# type: () -> None
stat = COUNT_STATS['messages_sent:is_bot:hour']
self.insert_data(stat, ['true', 'false'], ['false'])
result = self.client_get('/json/analytics/chart_data',
@@ -89,7 +96,8 @@ class TestGetChartData(ZulipTestCase):
'result': 'success',
})
def test_messages_sent_by_message_type(self) -> None:
def test_messages_sent_by_message_type(self):
# type: () -> None
stat = COUNT_STATS['messages_sent:message_type:day']
self.insert_data(stat, ['public_stream', 'private_message'],
['public_stream', 'private_stream'])
@@ -109,7 +117,8 @@ class TestGetChartData(ZulipTestCase):
'result': 'success',
})
def test_messages_sent_by_client(self) -> None:
def test_messages_sent_by_client(self):
# type: () -> None
stat = COUNT_STATS['messages_sent:client:day']
client1 = Client.objects.create(name='client 1')
client2 = Client.objects.create(name='client 2')
@@ -132,10 +141,10 @@ class TestGetChartData(ZulipTestCase):
'result': 'success',
})
def test_include_empty_subgroups(self) -> None:
def test_include_empty_subgroups(self):
# type: () -> None
FillState.objects.create(
property='realm_active_humans::day', end_time=self.end_times_day[0],
state=FillState.DONE)
property='realm_active_humans::day', end_time=self.end_times_day[0], state=FillState.DONE)
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'number_of_humans'})
self.assert_json_success(result)
@@ -144,8 +153,7 @@ class TestGetChartData(ZulipTestCase):
self.assertFalse('user' in data)
FillState.objects.create(
property='messages_sent:is_bot:hour', end_time=self.end_times_hour[0],
state=FillState.DONE)
property='messages_sent:is_bot:hour', end_time=self.end_times_hour[0], state=FillState.DONE)
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
@@ -154,22 +162,18 @@ class TestGetChartData(ZulipTestCase):
self.assertEqual(data['user'], {'human': [0], 'bot': [0]})
FillState.objects.create(
property='messages_sent:message_type:day', end_time=self.end_times_day[0],
state=FillState.DONE)
property='messages_sent:message_type:day', end_time=self.end_times_day[0], state=FillState.DONE)
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_by_message_type'})
self.assert_json_success(result)
data = result.json()
self.assertEqual(data['realm'], {
'Public streams': [0], 'Private streams': [0],
'Private messages': [0], 'Group private messages': [0]})
'Public streams': [0], 'Private streams': [0], 'Private messages': [0], 'Group private messages': [0]})
self.assertEqual(data['user'], {
'Public streams': [0], 'Private streams': [0],
'Private messages': [0], 'Group private messages': [0]})
'Public streams': [0], 'Private streams': [0], 'Private messages': [0], 'Group private messages': [0]})
FillState.objects.create(
property='messages_sent:client:day', end_time=self.end_times_day[0],
state=FillState.DONE)
property='messages_sent:client:day', end_time=self.end_times_day[0], state=FillState.DONE)
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_by_client'})
self.assert_json_success(result)
@@ -177,7 +181,8 @@ class TestGetChartData(ZulipTestCase):
self.assertEqual(data['realm'], {})
self.assertEqual(data['user'], {})
def test_start_and_end(self) -> None:
def test_start_and_end(self):
# type: () -> None
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
end_time_timestamps = [datetime_to_timestamp(dt) for dt in self.end_times_day]
@@ -199,7 +204,8 @@ class TestGetChartData(ZulipTestCase):
'end': end_time_timestamps[1]})
self.assert_json_error_contains(result, 'Start time is later than')
def test_min_length(self) -> None:
def test_min_length(self):
# type: () -> None
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
# test min_length is too short to change anything
@@ -220,12 +226,14 @@ class TestGetChartData(ZulipTestCase):
self.assertEqual(data['end_times'], [datetime_to_timestamp(dt) for dt in end_times])
self.assertEqual(data['realm'], {'human': [0]+self.data(100)})
def test_non_existent_chart(self) -> None:
def test_non_existent_chart(self):
# type: () -> None
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'does_not_exist'})
self.assert_json_error_contains(result, 'Unknown chart name')
def test_analytics_not_running(self) -> None:
def test_analytics_not_running(self):
# type: () -> None
# try to get data for a valid chart, but before we've put anything in the database
# (e.g. before update_analytics_counts has been run)
with mock.patch('logging.warning'):
@@ -236,7 +244,8 @@ class TestGetChartData(ZulipTestCase):
class TestGetChartDataHelpers(ZulipTestCase):
# last_successful_fill is in analytics/models.py, but get_chart_data is
# the only function that uses it at the moment
def test_last_successful_fill(self) -> None:
def test_last_successful_fill(self):
# type: () -> None
self.assertIsNone(last_successful_fill('non-existant'))
a_time = datetime(2016, 3, 14, 19).replace(tzinfo=utc)
one_hour_before = datetime(2016, 3, 14, 18).replace(tzinfo=utc)
@@ -247,18 +256,21 @@ class TestGetChartDataHelpers(ZulipTestCase):
fillstate.save()
self.assertEqual(last_successful_fill('property'), one_hour_before)
def test_sort_by_totals(self) -> None:
def test_sort_by_totals(self):
# type: () -> None
empty = [] # type: List[int]
value_arrays = {'c': [0, 1], 'a': [9], 'b': [1, 1, 1], 'd': empty}
self.assertEqual(sort_by_totals(value_arrays), ['a', 'b', 'c', 'd'])
def test_sort_client_labels(self) -> None:
def test_sort_client_labels(self):
# type: () -> None
data = {'realm': {'a': [16], 'c': [15], 'b': [14], 'e': [13], 'd': [12], 'h': [11]},
'user': {'a': [6], 'b': [5], 'd': [4], 'e': [3], 'f': [2], 'g': [1]}}
self.assertEqual(sort_client_labels(data), ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'])
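The expectation in `test_sort_by_totals` follows from the implementation that appears later in this comparison: labels are ranked by the sum of their value arrays, largest total first. A self-contained sketch of that function:

```python
from typing import Dict, List

def sort_by_totals(value_arrays):
    # type: (Dict[str, List[int]]) -> List[str]
    # Rank each label by the sum of its values, largest total first.
    # Because the whole (total, label) tuple is sorted with reverse=True,
    # ties fall back to reverse-alphabetical label order.
    totals = [(sum(values), label) for label, values in value_arrays.items()]
    totals.sort(reverse=True)
    return [label for total, label in totals]
```

With the test's input (totals a=9, b=3, c=1, d=0) this yields `['a', 'b', 'c', 'd']`, matching the assertion above.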
class TestTimeRange(ZulipTestCase):
def test_time_range(self) -> None:
def test_time_range(self):
# type: () -> None
HOUR = timedelta(hours=1)
DAY = timedelta(days=1)
@@ -284,7 +296,8 @@ class TestTimeRange(ZulipTestCase):
[floor_day-2*DAY, floor_day-DAY, floor_day, floor_day+DAY])
class TestMapArrays(ZulipTestCase):
def test_map_arrays(self) -> None:
def test_map_arrays(self):
# type: () -> None
a = {'desktop app 1.0': [1, 2, 3],
'desktop app 2.0': [10, 12, 13],
'desktop app 3.0': [21, 22, 23],


@@ -1,7 +1,7 @@
from django.conf.urls import include, url
from django.conf.urls import url, include
from zerver.lib.rest import rest_dispatch
import analytics.views
from zerver.lib.rest import rest_dispatch
i18n_urlpatterns = [
# Server admin (user_profile.is_staff) visible stats pages


@@ -1,52 +1,56 @@
import itertools
import json
import logging
import re
import time
from collections import defaultdict
from datetime import datetime, timedelta
from typing import Any, Callable, Dict, List, \
Optional, Set, Text, Tuple, Type, Union
import pytz
from django.conf import settings
from django.urls import reverse
from django.core import urlresolvers
from django.db import connection
from django.db.models import Sum
from django.db.models.query import QuerySet
from django.http import HttpRequest, HttpResponse, HttpResponseNotFound
from django.shortcuts import render
from django.http import HttpResponseNotFound, HttpRequest, HttpResponse
from django.template import RequestContext, loader
from django.utils.timezone import now as timezone_now, utc as timezone_utc
from django.utils.timezone import now as timezone_now
from django.utils.translation import ugettext as _
from django.shortcuts import render
from jinja2 import Markup as mark_safe
from analytics.lib.counts import COUNT_STATS, CountStat, process_count_stat
from analytics.lib.counts import CountStat, process_count_stat, COUNT_STATS
from analytics.lib.time_utils import time_range
from analytics.models import BaseCount, InstallationCount, \
RealmCount, StreamCount, UserCount, last_successful_fill
from zerver.decorator import require_server_admin, \
to_non_negative_int, to_utc_datetime, zulip_login_required
from zerver.lib.exceptions import JsonableError
from zerver.lib.request import REQ, has_request_variables
from analytics.models import BaseCount, InstallationCount, RealmCount, \
UserCount, StreamCount, last_successful_fill
from zerver.decorator import has_request_variables, REQ, require_server_admin, \
zulip_login_required, to_non_negative_int, to_utc_datetime
from zerver.lib.request import JsonableError
from zerver.lib.response import json_success
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, convert_to_UTC, timestamp_to_datetime
from zerver.models import Client, get_realm, Realm, \
UserActivity, UserActivityInterval, UserProfile
from zerver.lib.timestamp import ceiling_to_hour, ceiling_to_day, \
timestamp_to_datetime, convert_to_UTC
from zerver.models import Realm, UserProfile, UserActivity, \
UserActivityInterval, Client
from collections import defaultdict
from datetime import datetime, timedelta
import itertools
import json
import logging
import pytz
import re
import time
from six.moves import filter, map, range, zip
from typing import Any, Callable, Dict, List, Optional, Set, Text, \
Tuple, Type, Union
@zulip_login_required
def stats(request: HttpRequest) -> HttpResponse:
def stats(request):
# type: (HttpRequest) -> HttpResponse
return render(request,
'analytics/stats.html',
context=dict(realm_name = request.user.realm.name))
@has_request_variables
def get_chart_data(request: HttpRequest, user_profile: UserProfile, chart_name: Text=REQ(),
min_length: Optional[int]=REQ(converter=to_non_negative_int, default=None),
start: Optional[datetime]=REQ(converter=to_utc_datetime, default=None),
end: Optional[datetime]=REQ(converter=to_utc_datetime, default=None)) -> HttpResponse:
def get_chart_data(request, user_profile, chart_name=REQ(),
min_length=REQ(converter=to_non_negative_int, default=None),
start=REQ(converter=to_utc_datetime, default=None),
end=REQ(converter=to_utc_datetime, default=None)):
# type: (HttpRequest, UserProfile, Text, Optional[int], Optional[datetime], Optional[datetime]) -> HttpResponse
if chart_name == 'number_of_humans':
stat = COUNT_STATS['realm_active_humans::day']
tables = [RealmCount]
@@ -115,7 +119,8 @@ def get_chart_data(request: HttpRequest, user_profile: UserProfile, chart_name:
data['display_order'] = None
return json_success(data=data)
def sort_by_totals(value_arrays: Dict[str, List[int]]) -> List[str]:
def sort_by_totals(value_arrays):
# type: (Dict[str, List[int]]) -> List[str]
totals = [(sum(values), label) for label, values in value_arrays.items()]
totals.sort(reverse=True)
return [label for total, label in totals]
@@ -126,7 +131,8 @@ def sort_by_totals(value_arrays: Dict[str, List[int]]) -> List[str]:
# understanding the realm's traffic and the user's traffic. This function
# tries to rank the clients so that taking the first N elements of the
# sorted list has a reasonable chance of doing so.
def sort_client_labels(data: Dict[str, Dict[str, List[int]]]) -> List[str]:
def sort_client_labels(data):
# type: (Dict[str, Dict[str, List[int]]]) -> List[str]
realm_order = sort_by_totals(data['realm'])
user_order = sort_by_totals(data['user'])
label_sort_values = {} # type: Dict[str, float]
@@ -137,7 +143,8 @@ def sort_client_labels(data: Dict[str, Dict[str, List[int]]]) -> List[str]:
return [label for label, sort_value in sorted(label_sort_values.items(),
key=lambda x: x[1])]
def table_filtered_to_id(table: Type[BaseCount], key_id: int) -> QuerySet:
def table_filtered_to_id(table, key_id):
# type: (Type[BaseCount], int) -> QuerySet
if table == RealmCount:
return RealmCount.objects.filter(realm_id=key_id)
elif table == UserCount:
@@ -149,7 +156,8 @@ def table_filtered_to_id(table: Type[BaseCount], key_id: int) -> QuerySet:
else:
raise AssertionError("Unknown table: %s" % (table,))
def client_label_map(name: str) -> str:
def client_label_map(name):
# type: (str) -> str
if name == "website":
return "Website"
if name.startswith("desktop app"):
@@ -168,7 +176,8 @@ def client_label_map(name: str) -> str:
return name[len("Zulip"):-len("Webhook")] + " webhook"
return name
def rewrite_client_arrays(value_arrays: Dict[str, List[int]]) -> Dict[str, List[int]]:
def rewrite_client_arrays(value_arrays):
# type: (Dict[str, List[int]]) -> Dict[str, List[int]]
mapped_arrays = {} # type: Dict[str, List[int]]
for label, array in value_arrays.items():
mapped_label = client_label_map(label)
@@ -179,12 +188,8 @@ def rewrite_client_arrays(value_arrays: Dict[str, List[int]]) -> Dict[str, List[
mapped_arrays[mapped_label] = [value_arrays[label][i] for i in range(0, len(array))]
return mapped_arrays
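The hunk above elides the branch that handles colliding labels. A plausible reconstruction, with a hypothetical stand-in `client_label_map` that only collapses versioned desktop-app names (the real mapper also special-cases `website`, Zulip webhook clients, etc.):

```python
from typing import Dict, List

def client_label_map(name):
    # type: (str) -> str
    # Hypothetical stand-in: collapse e.g. 'desktop app 1.0' and
    # 'desktop app 2.0' into a single 'Desktop app' label.
    if name.startswith("desktop app"):
        return "Desktop app"
    return name

def rewrite_client_arrays(value_arrays):
    # type: (Dict[str, List[int]]) -> Dict[str, List[int]]
    mapped_arrays = {}  # type: Dict[str, List[int]]
    for label, array in value_arrays.items():
        mapped_label = client_label_map(label)
        if mapped_label in mapped_arrays:
            # Elided in the hunk above: arrays whose labels collapse to
            # the same canonical name are assumed to be summed element-wise.
            for i in range(len(array)):
                mapped_arrays[mapped_label][i] += array[i]
        else:
            mapped_arrays[mapped_label] = [array[i] for i in range(0, len(array))]
    return mapped_arrays
```

Under these assumptions, `{'desktop app 1.0': [1, 2], 'desktop app 2.0': [10, 20], 'website': [5]}` becomes `{'Desktop app': [11, 22], 'website': [5]}`.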
def get_time_series_by_subgroup(stat: CountStat,
table: Type[BaseCount],
key_id: int,
end_times: List[datetime],
subgroup_to_label: Dict[Optional[str], str],
include_empty_subgroups: bool) -> Dict[str, List[int]]:
def get_time_series_by_subgroup(stat, table, key_id, end_times, subgroup_to_label, include_empty_subgroups):
# type: (CountStat, Type[BaseCount], int, List[datetime], Dict[Optional[str], str], bool) -> Dict[str, List[int]]
queryset = table_filtered_to_id(table, key_id).filter(property=stat.property) \
.values_list('subgroup', 'end_time', 'value')
value_dicts = defaultdict(lambda: defaultdict(int)) # type: Dict[Optional[str], Dict[datetime, int]]
@@ -205,10 +210,12 @@ def get_time_series_by_subgroup(stat: CountStat,
eastern_tz = pytz.timezone('US/Eastern')
def make_table(title: str, cols: List[str], rows: List[Any], has_row_class: bool=False) -> str:
def make_table(title, cols, rows, has_row_class=False):
# type: (str, List[str], List[Any], bool) -> str
if not has_row_class:
def fix_row(row: Any) -> Dict[str, Any]:
def fix_row(row):
# type: (Any) -> Dict[str, Any]
return dict(cells=row, row_class=None)
rows = list(map(fix_row, rows))
@@ -221,7 +228,8 @@ def make_table(title: str, cols: List[str], rows: List[Any], has_row_class: bool
return content
def dictfetchall(cursor: connection.cursor) -> List[Dict[str, Any]]:
def dictfetchall(cursor):
# type: (connection.cursor) -> List[Dict[str, Any]]
"Returns all rows from a cursor as a dict"
desc = cursor.description
return [
@@ -230,7 +238,8 @@ def dictfetchall(cursor: connection.cursor) -> List[Dict[str, Any]]:
]
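The body of `dictfetchall` is elided above; it presumably follows the standard recipe from the Django documentation, zipping the column names in `cursor.description` onto each fetched row. That recipe works against any DB-API cursor, for example sqlite3:

```python
import sqlite3
from typing import Any, Dict, List

def dictfetchall(cursor):
    # type: (Any) -> List[Dict[str, Any]]
    "Returns all rows from a cursor as a dict"
    desc = cursor.description  # column name is the first field of each entry
    return [
        dict(zip([col[0] for col in desc], row))
        for row in cursor.fetchall()
    ]

# Demonstration against an in-memory sqlite3 database:
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE realm (string_id TEXT, dau_count INTEGER)")
cur.execute("INSERT INTO realm VALUES ('zulip', 42)")
cur.execute("SELECT string_id, dau_count FROM realm")
rows = dictfetchall(cur)
```

Here `rows` is `[{'string_id': 'zulip', 'dau_count': 42}]`, which is the shape `realm_summary_table` consumes.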
def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
def get_realm_day_counts():
# type: () -> Dict[str, Dict[str, str]]
query = '''
select
r.string_id,
@@ -265,13 +274,12 @@ def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
result = {}
for string_id in counts:
raw_cnts = [counts[string_id].get(age, 0) for age in range(8)]
min_cnt = min(raw_cnts[1:])
max_cnt = max(raw_cnts[1:])
min_cnt = min(raw_cnts)
max_cnt = max(raw_cnts)
def format_count(cnt: int, style: Optional[str]=None) -> str:
if style is not None:
good_bad = style
elif cnt == min_cnt:
def format_count(cnt):
# type: (int) -> str
if cnt == min_cnt:
good_bad = 'bad'
elif cnt == max_cnt:
good_bad = 'good'
@@ -280,21 +288,18 @@ def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
return '<td class="number %s">%s</td>' % (good_bad, cnt)
cnts = (format_count(raw_cnts[0], 'neutral')
+ ''.join(map(format_count, raw_cnts[1:])))
cnts = ''.join(map(format_count, raw_cnts))
result[string_id] = dict(cnts=cnts)
return result
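The change in the hunk above is in how per-day message counts are styled: on the 1.8.0 side, the first cell (the oldest day) is forced to `'neutral'`, and the min/max highlighting is computed over the remaining seven days only. A sketch of that classification logic (the final `'neutral'` fallback is an assumption, since the hunk elides it):

```python
from typing import List

def classify_counts(raw_cnts):
    # type: (List[int]) -> List[str]
    # Mirrors the 1.8.0-side logic: the first cell is always 'neutral';
    # among the trailing days, the minimum count is 'bad', the maximum
    # is 'good', and everything else is assumed 'neutral'.
    min_cnt = min(raw_cnts[1:])
    max_cnt = max(raw_cnts[1:])

    def style(cnt):
        # type: (int) -> str
        if cnt == min_cnt:
            return 'bad'
        elif cnt == max_cnt:
            return 'good'
        return 'neutral'

    return ['neutral'] + [style(cnt) for cnt in raw_cnts[1:]]
```

For `[9, 1, 5, 9]` this gives `['neutral', 'bad', 'neutral', 'good']`: the leading 9 is excluded from the min/max window, so only the trailing 9 is highlighted as `'good'`.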
def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
now = timezone_now()
def realm_summary_table(realm_minutes):
# type: (Dict[str, float]) -> str
query = '''
SELECT
realm.string_id,
realm.date_created,
coalesce(user_counts.dau_count, 0) dau_count,
coalesce(wau_counts.wau_count, 0) wau_count,
coalesce(user_counts.active_user_count, 0) active_user_count,
coalesce(at_risk_counts.at_risk_count, 0) at_risk_count,
(
SELECT
count(*)
@@ -316,24 +321,22 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
(
SELECT
up.realm_id realm_id,
count(distinct(ua.user_profile_id)) dau_count
count(distinct(ua.user_profile_id)) active_user_count
FROM zerver_useractivity ua
JOIN zerver_userprofile up
ON up.id = ua.user_profile_id
WHERE
up.is_active
AND (not up.is_bot)
AND
query in (
'/json/send_message',
'send_message_backend',
'/api/v1/send_message',
'/json/update_pointer',
'/json/users/me/pointer',
'update_pointer_backend'
'/json/users/me/pointer'
)
AND
last_visit > now() - interval '1 day'
AND
not is_bot
GROUP BY realm_id
) user_counts
ON user_counts.realm_id = realm.id
@@ -341,7 +344,7 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
(
SELECT
realm_id,
count(*) wau_count
count(*) at_risk_count
FROM (
SELECT
realm.id as realm_id,
@@ -359,37 +362,35 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
'send_message_backend',
'/api/v1/send_message',
'/json/update_pointer',
'/json/users/me/pointer',
'update_pointer_backend'
'/json/users/me/pointer'
)
GROUP by realm.id, up.email
HAVING max(last_visit) > now() - interval '7 day'
) as wau_users
HAVING max(last_visit) between
now() - interval '7 day' and
now() - interval '1 day'
) as at_risk_users
GROUP BY realm_id
) wau_counts
ON wau_counts.realm_id = realm.id
) at_risk_counts
ON at_risk_counts.realm_id = realm.id
WHERE EXISTS (
SELECT *
FROM zerver_useractivity ua
JOIN zerver_userprofile up
ON up.id = ua.user_profile_id
WHERE
up.realm_id = realm.id
AND up.is_active
AND (not up.is_bot)
AND
query in (
'/json/send_message',
'/api/v1/send_message',
'send_message_backend',
'/json/update_pointer',
'/json/users/me/pointer',
'update_pointer_backend'
'/json/users/me/pointer'
)
AND
up.realm_id = realm.id
AND
last_visit > now() - interval '2 week'
)
ORDER BY dau_count DESC, string_id ASC
ORDER BY active_user_count DESC, string_id ASC
'''
cursor = connection.cursor()
@@ -397,21 +398,6 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
rows = dictfetchall(cursor)
cursor.close()
# Fetch all the realm administrator users
realm_admins = defaultdict(list) # type: Dict[str, List[str]]
for up in UserProfile.objects.select_related("realm").filter(
is_realm_admin=True,
is_active=True
):
realm_admins[up.realm.string_id].append(up.email)
for row in rows:
row['date_created_day'] = row['date_created'].strftime('%Y-%m-%d')
row['age_days'] = int((now - row['date_created']).total_seconds()
/ 86400)
row['is_new'] = row['age_days'] < 12 * 7
row['realm_admin_email'] = ', '.join(realm_admins[row['string_id']])
# get messages sent per day
counts = get_realm_day_counts()
for row in rows:
@@ -429,7 +415,7 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
total_hours += hours
row['hours'] = str(int(hours))
try:
row['hours_per_user'] = '%.1f' % (hours / row['dau_count'],)
row['hours_per_user'] = '%.1f' % (hours / row['active_user_count'],)
except Exception:
pass
@@ -438,42 +424,41 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
row['string_id'] = realm_activity_link(row['string_id'])
# Count active sites
def meets_goal(row: Dict[str, int]) -> bool:
return row['dau_count'] >= 5
def meets_goal(row):
# type: (Dict[str, int]) -> bool
return row['active_user_count'] >= 5
num_active_sites = len(list(filter(meets_goal, rows)))
# create totals
total_dau_count = 0
total_active_user_count = 0
total_user_profile_count = 0
total_bot_count = 0
total_wau_count = 0
total_at_risk_count = 0
for row in rows:
total_dau_count += int(row['dau_count'])
total_active_user_count += int(row['active_user_count'])
total_user_profile_count += int(row['user_profile_count'])
total_bot_count += int(row['bot_count'])
total_wau_count += int(row['wau_count'])
total_at_risk_count += int(row['at_risk_count'])
rows.append(dict(
string_id='Total',
date_created_day='',
realm_admin_email='',
dau_count=total_dau_count,
active_user_count=total_active_user_count,
user_profile_count=total_user_profile_count,
bot_count=total_bot_count,
hours=int(total_hours),
wau_count=total_wau_count,
at_risk_count=total_at_risk_count,
))
content = loader.render_to_string(
'analytics/realm_summary_table.html',
dict(rows=rows, num_active_sites=num_active_sites,
now=now.strftime('%Y-%m-%dT%H:%M:%SZ'))
dict(rows=rows, num_active_sites=num_active_sites)
)
return content
def user_activity_intervals() -> Tuple[mark_safe, Dict[str, float]]:
def user_activity_intervals():
# type: () -> Tuple[mark_safe, Dict[str, float]]
day_end = timestamp_to_datetime(time.time())
day_start = day_end - timedelta(hours=24)
@@ -523,7 +508,8 @@ def user_activity_intervals() -> Tuple[mark_safe, Dict[str, float]]:
content = mark_safe('<pre>' + output + '</pre>')
return content, realm_minutes
def sent_messages_report(realm: str) -> str:
def sent_messages_report(realm):
# type: (str) -> str
title = 'Recently sent messages for ' + realm
cols = [
@@ -590,16 +576,18 @@ def sent_messages_report(realm: str) -> str:
return make_table(title, cols, rows)
def ad_hoc_queries() -> List[Dict[str, str]]:
def get_page(query: str, cols: List[str], title: str) -> Dict[str, str]:
def ad_hoc_queries():
# type: () -> List[Dict[str, str]]
def get_page(query, cols, title):
# type: (str, List[str], str) -> Dict[str, str]
cursor = connection.cursor()
cursor.execute(query)
rows = cursor.fetchall()
rows = list(map(list, rows))
cursor.close()
def fix_rows(i: int,
fixup_func: Union[Callable[[Realm], mark_safe], Callable[[datetime], str]]) -> None:
def fix_rows(i, fixup_func):
# type: (int, Union[Callable[[Realm], mark_safe], Callable[[datetime], str]]) -> None
for row in rows:
row[i] = fixup_func(row[i])
@@ -761,7 +749,8 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
@require_server_admin
@has_request_variables
def get_activity(request: HttpRequest) -> HttpResponse:
def get_activity(request):
# type: (HttpRequest) -> HttpResponse
duration_content, realm_minutes = user_activity_intervals() # type: Tuple[mark_safe, Dict[str, float]]
counts_content = realm_summary_table(realm_minutes) # type: str
data = [
@@ -779,7 +768,8 @@ def get_activity(request: HttpRequest) -> HttpResponse:
context=dict(data=data, title=title, is_home=True),
)
def get_user_activity_records_for_realm(realm: str, is_bot: bool) -> QuerySet:
def get_user_activity_records_for_realm(realm, is_bot):
# type: (str, bool) -> QuerySet
fields = [
'user_profile__full_name',
'user_profile__email',
@@ -798,7 +788,8 @@ def get_user_activity_records_for_realm(realm: str, is_bot: bool) -> QuerySet:
records = records.select_related('user_profile', 'client').only(*fields)
return records
def get_user_activity_records_for_email(email: str) -> List[QuerySet]:
def get_user_activity_records_for_email(email):
# type: (str) -> List[QuerySet]
fields = [
'user_profile__full_name',
'query',
@@ -814,7 +805,8 @@ def get_user_activity_records_for_email(email: str) -> List[QuerySet]:
records = records.select_related('user_profile', 'client').only(*fields)
return records
def raw_user_activity_table(records: List[QuerySet]) -> str:
def raw_user_activity_table(records):
# type: (List[QuerySet]) -> str
cols = [
'query',
'client',
@@ -822,7 +814,8 @@ def raw_user_activity_table(records: List[QuerySet]) -> str:
'last_visit'
]
def row(record: QuerySet) -> List[Any]:
def row(record):
# type: (QuerySet) -> List[Any]
return [
record.query,
record.client.name,
@@ -834,7 +827,8 @@ def raw_user_activity_table(records: List[QuerySet]) -> str:
title = 'Raw Data'
return make_table(title, cols, rows)
def get_user_activity_summary(records: List[QuerySet]) -> Dict[str, Dict[str, Any]]:
def get_user_activity_summary(records):
# type: (List[QuerySet]) -> Dict[str, Dict[str, Any]]
#: `Any` used above should be `Union(int, datetime)`.
#: However current version of `Union` does not work inside other function.
#: We could use something like:
@@ -842,7 +836,8 @@ def get_user_activity_summary(records: List[QuerySet]) -> Dict[str, Dict[str, An
#: but that would require this long `Union` to carry on throughout inner functions.
summary = {} # type: Dict[str, Dict[str, Any]]
def update(action: str, record: QuerySet) -> None:
def update(action, record):
# type: (str, QuerySet) -> None
if action not in summary:
summary[action] = dict(
count=record.count,
@@ -876,32 +871,35 @@ def get_user_activity_summary(records: List[QuerySet]) -> Dict[str, Dict[str, An
update('website', record)
if ('send_message' in query) or re.search('/api/.*/external/.*', query):
update('send', record)
if query in ['/json/update_pointer', '/json/users/me/pointer', '/api/v1/update_pointer',
'update_pointer_backend']:
if query in ['/json/update_pointer', '/json/users/me/pointer', '/api/v1/update_pointer']:
update('pointer', record)
update(client, record)
return summary
def format_date_for_activity_reports(date: Optional[datetime]) -> str:
def format_date_for_activity_reports(date):
# type: (Optional[datetime]) -> str
if date:
return date.astimezone(eastern_tz).strftime('%Y-%m-%d %H:%M')
else:
return ''
def user_activity_link(email: str) -> mark_safe:
def user_activity_link(email):
# type: (str) -> mark_safe
url_name = 'analytics.views.get_user_activity'
url = reverse(url_name, kwargs=dict(email=email))
url = urlresolvers.reverse(url_name, kwargs=dict(email=email))
email_link = '<a href="%s">%s</a>' % (url, email)
return mark_safe(email_link)
def realm_activity_link(realm_str: str) -> mark_safe:
def realm_activity_link(realm_str):
# type: (str) -> mark_safe
url_name = 'analytics.views.get_realm_activity'
url = reverse(url_name, kwargs=dict(realm_str=realm_str))
url = urlresolvers.reverse(url_name, kwargs=dict(realm_str=realm_str))
realm_link = '<a href="%s">%s</a>' % (url, realm_str)
return mark_safe(realm_link)
def realm_client_table(user_summaries: Dict[str, Dict[str, Dict[str, Any]]]) -> str:
def realm_client_table(user_summaries):
# type: (Dict[str, Dict[str, Dict[str, Any]]]) -> str
exclude_keys = [
'internal',
'name',
@@ -945,7 +943,8 @@ def realm_client_table(user_summaries: Dict[str, Dict[str, Dict[str, Any]]]) ->
return make_table(title, cols, rows)
def user_activity_summary_table(user_summary: Dict[str, Dict[str, Any]]) -> str:
def user_activity_summary_table(user_summary):
# type: (Dict[str, Dict[str, Any]]) -> str
rows = []
for k, v in user_summary.items():
if k == 'name':
@@ -971,29 +970,33 @@ def user_activity_summary_table(user_summary: Dict[str, Dict[str, Any]]) -> str:
title = 'User Activity'
return make_table(title, cols, rows)
def realm_user_summary_table(all_records: List[QuerySet],
admin_emails: Set[Text]) -> Tuple[Dict[str, Dict[str, Any]], str]:
def realm_user_summary_table(all_records, admin_emails):
# type: (List[QuerySet], Set[Text]) -> Tuple[Dict[str, Dict[str, Any]], str]
user_records = {}
def by_email(record: QuerySet) -> str:
def by_email(record):
# type: (QuerySet) -> str
return record.user_profile.email
for email, records in itertools.groupby(all_records, by_email):
user_records[email] = get_user_activity_summary(list(records))
def get_last_visit(user_summary: Dict[str, Dict[str, datetime]], k: str) -> Optional[datetime]:
def get_last_visit(user_summary, k):
# type: (Dict[str, Dict[str, datetime]], str) -> Optional[datetime]
if k in user_summary:
return user_summary[k]['last_visit']
else:
return None
def get_count(user_summary: Dict[str, Dict[str, str]], k: str) -> str:
def get_count(user_summary, k):
# type: (Dict[str, Dict[str, str]], str) -> str
if k in user_summary:
return user_summary[k]['count']
else:
return ''
def is_recent(val: Optional[datetime]) -> bool:
def is_recent(val):
# type: (Optional[datetime]) -> bool
age = timezone_now() - val
return age.total_seconds() < 5 * 60
@@ -1015,7 +1018,8 @@ def realm_user_summary_table(all_records: List[QuerySet],
row = dict(cells=cells, row_class=row_class)
rows.append(row)
def by_used_time(row: Dict[str, Any]) -> str:
def by_used_time(row):
# type: (Dict[str, Any]) -> str
return row['cells'][3]
rows = sorted(rows, key=by_used_time, reverse=True)
@@ -1038,7 +1042,8 @@ def realm_user_summary_table(all_records: List[QuerySet],
return user_records, content
@require_server_admin
def get_realm_activity(request: HttpRequest, realm_str: str) -> HttpResponse:
def get_realm_activity(request, realm_str):
# type: (HttpRequest, str) -> HttpResponse
data = [] # type: List[Tuple[str, str]]
all_user_records = {} # type: Dict[str, Any]
@@ -1073,7 +1078,8 @@ def get_realm_activity(request: HttpRequest, realm_str: str) -> HttpResponse:
)
@require_server_admin
def get_user_activity(request: HttpRequest, email: str) -> HttpResponse:
def get_user_activity(request, email):
# type: (HttpRequest, str) -> HttpResponse
records = get_user_activity_records_for_email(email)
data = [] # type: List[Tuple[str, str]]


@@ -1,22 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2017-11-30 00:13
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('zerver', '0124_stream_enable_notifications'),
('confirmation', '0004_remove_confirmationmanager'),
]
operations = [
migrations.AddField(
model_name='confirmation',
name='realm',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm'),
),
]


@@ -1,20 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 18:39
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('confirmation', '0005_confirmation_realm'),
]
operations = [
migrations.AddField(
model_name='realmcreationkey',
name='presume_email_valid',
field=models.BooleanField(default=False),
),
]


@@ -7,9 +7,9 @@ __revision__ = '$Id: models.py 28 2009-10-22 15:03:02Z jarek.zgoda $'
import datetime
from django.db import models
from django.db.models import CASCADE
from django.urls import reverse
from django.core.urlresolvers import reverse
from django.conf import settings
from django.contrib.sites.models import Site
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericForeignKey
from django.http import HttpRequest, HttpResponse
@@ -18,8 +18,7 @@ from django.utils.timezone import now as timezone_now
from zerver.lib.send_email import send_email
from zerver.lib.utils import generate_random_token
from zerver.models import PreregistrationUser, EmailChangeStatus, MultiuseInvite, \
UserProfile, Realm
from zerver.models import PreregistrationUser, EmailChangeStatus, MultiuseInvite
from random import SystemRandom
import string
from typing import Any, Dict, Optional, Text, Union
@@ -29,31 +28,32 @@ class ConfirmationKeyException(Exception):
EXPIRED = 2
DOES_NOT_EXIST = 3
def __init__(self, error_type: int) -> None:
super().__init__()
def __init__(self, error_type):
# type: (int) -> None
super(ConfirmationKeyException, self).__init__()
self.error_type = error_type
def render_confirmation_key_error(request: HttpRequest, exception: ConfirmationKeyException) -> HttpResponse:
def render_confirmation_key_error(request, exception):
# type: (HttpRequest, ConfirmationKeyException) -> HttpResponse
if exception.error_type == ConfirmationKeyException.WRONG_LENGTH:
return render(request, 'confirmation/link_malformed.html')
if exception.error_type == ConfirmationKeyException.EXPIRED:
return render(request, 'confirmation/link_expired.html')
return render(request, 'confirmation/link_does_not_exist.html')
def generate_key() -> str:
def generate_key():
# type: () -> str
generator = SystemRandom()
# 24 characters * 5 bits of entropy/character = 120 bits of entropy
return ''.join(generator.choice(string.ascii_lowercase + string.digits) for _ in range(24))
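The entropy arithmetic in the comment above is easy to verify. A minimal sketch (the 36-symbol alphabet matches `string.ascii_lowercase + string.digits` used by `generate_key`):

```python
import math
import string

# Alphabet used by generate_key(): 26 lowercase letters + 10 digits.
alphabet = string.ascii_lowercase + string.digits
assert len(alphabet) == 36

bits_per_char = math.log2(len(alphabet))  # ~5.17 bits per character
total_bits = 24 * bits_per_char           # ~124 bits for a 24-character key

# The in-code comment rounds down to 5 bits/char, which yields the
# conservative "120 bits of entropy" figure.
print(round(total_bits, 1))
```

The "5 bits/character" in the source comment is thus a slight underestimate; the actual lower bound is a bit above 124 bits.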
ConfirmationObjT = Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
def get_object_from_key(confirmation_key: str,
confirmation_type: int) -> ConfirmationObjT:
def get_object_from_key(confirmation_key):
# type: (str) -> Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
# Confirmation keys used to be 40 characters
if len(confirmation_key) not in (24, 40):
raise ConfirmationKeyException(ConfirmationKeyException.WRONG_LENGTH)
try:
confirmation = Confirmation.objects.get(confirmation_key=confirmation_key,
type=confirmation_type)
confirmation = Confirmation.objects.get(confirmation_key=confirmation_key)
except Confirmation.DoesNotExist:
raise ConfirmationKeyException(ConfirmationKeyException.DOES_NOT_EXIST)
@@ -67,17 +67,15 @@ def get_object_from_key(confirmation_key: str,
obj.save(update_fields=['status'])
return obj
def create_confirmation_link(obj: ContentType, host: str,
confirmation_type: int,
url_args: Optional[Dict[str, str]]=None) -> str:
def create_confirmation_link(obj, host, confirmation_type, url_args=None):
# type: (Union[ContentType, int], str, int, Optional[Dict[str, str]]) -> str
key = generate_key()
Confirmation.objects.create(content_object=obj, date_sent=timezone_now(), confirmation_key=key,
realm=obj.realm, type=confirmation_type)
type=confirmation_type)
return confirmation_url(key, host, confirmation_type, url_args)
def confirmation_url(confirmation_key: str, host: str,
confirmation_type: int,
url_args: Optional[Dict[str, str]]=None) -> str:
def confirmation_url(confirmation_key, host, confirmation_type, url_args=None):
# type: (str, str, int, Optional[Dict[str, str]]) -> str
if url_args is None:
url_args = {}
url_args['confirmation_key'] = confirmation_key
@@ -85,12 +83,11 @@ def confirmation_url(confirmation_key: str, host: str,
reverse(_properties[confirmation_type].url_name, kwargs=url_args))
class Confirmation(models.Model):
content_type = models.ForeignKey(ContentType, on_delete=CASCADE)
content_type = models.ForeignKey(ContentType)
object_id = models.PositiveIntegerField() # type: int
content_object = GenericForeignKey('content_type', 'object_id')
date_sent = models.DateTimeField() # type: datetime.datetime
confirmation_key = models.CharField(max_length=40) # type: str
realm = models.ForeignKey(Realm, null=True, on_delete=CASCADE) # type: Optional[Realm]
# The following list is the set of valid types
USER_REGISTRATION = 1
@@ -99,69 +96,51 @@ class Confirmation(models.Model):
UNSUBSCRIBE = 4
SERVER_REGISTRATION = 5
MULTIUSE_INVITE = 6
REALM_CREATION = 7
type = models.PositiveSmallIntegerField() # type: int
def __str__(self) -> Text:
def __unicode__(self):
# type: () -> Text
return '<Confirmation: %s>' % (self.content_object,)
class ConfirmationType:
def __init__(self, url_name: str,
validity_in_days: int=settings.CONFIRMATION_LINK_DEFAULT_VALIDITY_DAYS) -> None:
class ConfirmationType(object):
def __init__(self, url_name, validity_in_days=settings.CONFIRMATION_LINK_DEFAULT_VALIDITY_DAYS):
# type: (str, int) -> None
self.url_name = url_name
self.validity_in_days = validity_in_days
_properties = {
Confirmation.USER_REGISTRATION: ConfirmationType('check_prereg_key_and_redirect'),
Confirmation.INVITATION: ConfirmationType('check_prereg_key_and_redirect',
Confirmation.USER_REGISTRATION: ConfirmationType('confirmation.views.confirm'),
Confirmation.INVITATION: ConfirmationType('confirmation.views.confirm',
validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS),
Confirmation.EMAIL_CHANGE: ConfirmationType('zerver.views.user_settings.confirm_email_change'),
Confirmation.UNSUBSCRIBE: ConfirmationType('zerver.views.unsubscribe.email_unsubscribe',
validity_in_days=1000000), # should never expire
Confirmation.MULTIUSE_INVITE: ConfirmationType(
'zerver.views.registration.accounts_home_from_multiuse_invite',
validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS),
Confirmation.REALM_CREATION: ConfirmationType('check_prereg_key_and_redirect'),
Confirmation.MULTIUSE_INVITE: ConfirmationType('zerver.views.registration.accounts_home_from_multiuse_invite',
validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS)
}
# Functions related to links generated by the generate_realm_creation_link.py
# management command.
# Note that being validated here will just allow the user to access the create_realm
# form, where they will enter their email and go through the regular
# Confirmation.REALM_CREATION pathway.
# Arguably RealmCreationKey should just be another ConfirmationObjT and we should
# add another Confirmation.type for this; it's this way for historical reasons.
# Confirmation pathways for which there is no content_object that we need to
# keep track of.
def validate_key(creation_key: Optional[str]) -> Optional['RealmCreationKey']:
"""Get the record for this key, raising InvalidCreationKey if non-None but invalid."""
if creation_key is None:
return None
try:
key_record = RealmCreationKey.objects.get(creation_key=creation_key)
except RealmCreationKey.DoesNotExist:
raise RealmCreationKey.Invalid()
time_elapsed = timezone_now() - key_record.date_created
if time_elapsed.total_seconds() > settings.REALM_CREATION_LINK_VALIDITY_DAYS * 24 * 3600:
raise RealmCreationKey.Invalid()
return key_record
def check_key_is_valid(creation_key):
# type: (Text) -> bool
if not RealmCreationKey.objects.filter(creation_key=creation_key).exists():
return False
days_sofar = (timezone_now() - RealmCreationKey.objects.get(creation_key=creation_key).date_created).days
# Realm creation link expires after settings.REALM_CREATION_LINK_VALIDITY_DAYS
if days_sofar <= settings.REALM_CREATION_LINK_VALIDITY_DAYS:
return True
return False
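Both variants above enforce the same expiry rule for realm creation links. A minimal, self-contained sketch of that check, mirroring the seconds-based comparison in `validate_key` (here `VALIDITY_DAYS` is a stand-in for `settings.REALM_CREATION_LINK_VALIDITY_DAYS`):

```python
from datetime import datetime, timedelta, timezone

VALIDITY_DAYS = 7  # stand-in for settings.REALM_CREATION_LINK_VALIDITY_DAYS

def key_expired(date_created, now=None):
    # Mirrors validate_key(): compare elapsed seconds against the window.
    if now is None:
        now = datetime.now(timezone.utc)
    elapsed = (now - date_created).total_seconds()
    return elapsed > VALIDITY_DAYS * 24 * 3600

now = datetime.now(timezone.utc)
print(key_expired(now - timedelta(days=3)))  # still inside the window
print(key_expired(now - timedelta(days=8)))  # past the window
```

Note the older `check_key_is_valid` compares whole `.days` rather than seconds, so the two differ slightly at the boundary; the sketch follows the newer, stricter form.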
def generate_realm_creation_url(by_admin: bool=False) -> Text:
def generate_realm_creation_url():
# type: () -> Text
key = generate_key()
RealmCreationKey.objects.create(creation_key=key,
date_created=timezone_now(),
presume_email_valid=by_admin)
return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME,
settings.EXTERNAL_HOST,
reverse('zerver.views.create_realm',
kwargs={'creation_key': key}))
RealmCreationKey.objects.create(creation_key=key, date_created=timezone_now())
return u'%s%s%s' % (settings.EXTERNAL_URI_SCHEME,
settings.EXTERNAL_HOST,
reverse('zerver.views.create_realm',
kwargs={'creation_key': key}))
class RealmCreationKey(models.Model):
creation_key = models.CharField('activation key', max_length=40)
date_created = models.DateTimeField('created', default=timezone_now)
# True just if we should presume the email address the user enters
# is theirs, and skip sending mail to it to confirm that.
presume_email_valid = models.BooleanField(default=False) # type: bool
class Invalid(Exception):
pass

confirmation/views.py (new file)

@@ -0,0 +1,31 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
__revision__ = '$Id: views.py 21 2008-12-05 09:21:03Z jarek.zgoda $'
from django.shortcuts import render
from django.template import RequestContext
from django.conf import settings
from django.http import HttpRequest, HttpResponse
from confirmation.models import Confirmation, get_object_from_key, ConfirmationKeyException, \
render_confirmation_key_error
from zerver.models import PreregistrationUser
from typing import Any, Dict
# This is currently only used for confirming PreregistrationUser.
# Do not add other confirmation paths here.
def confirm(request, confirmation_key):
# type: (HttpRequest, str) -> HttpResponse
try:
get_object_from_key(confirmation_key)
except ConfirmationKeyException as exception:
return render_confirmation_key_error(request, exception)
return render(request, 'confirmation/confirm_preregistrationuser.html',
context={
'key': confirmation_key,
'full_name': request.GET.get("full_name", None)})


@@ -1,5 +1,5 @@
from django.conf.urls import url
from django.views.generic import TemplateView
from django.views.generic import TemplateView, RedirectView
i18n_urlpatterns = [
# Zephyr/MIT


@@ -2,7 +2,7 @@
#
# You can set these variables from the command line.
SPHINXOPTS = -j8
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build


@@ -1,4 +1,4 @@
# Documentation systems
# Documentation
Zulip has three major documentation systems:
@@ -58,8 +58,7 @@ Markdown, though that won't be as faithful as the `make html`
approach.
When editing dependencies for the Zulip documentation, you should edit
`requirements/docs.in` and then run `tools/update-locked-requirements`
which updates the `docs.txt` file (which is used by ReadTheDocs to build the
`requirements/docs.txt` (which is used by ReadTheDocs to build the
Zulip developer documentation, without installing all of Zulip's
dependencies).


@@ -37,7 +37,7 @@ Copyright: 2011, Krzysztof Wilczynski
2011, Puppet Labs Inc
License: Apache-2.0
File: puppet/zulip_ops/files/mediawiki/Auth_remoteuser.php
File: puppet/zulip_internal/files/mediawiki/Auth_remoteuser.php
Copyright: 2006 Otheus Shelling
2007 Rusty Burchfield
2009 James Kinsman
@@ -59,24 +59,15 @@ Files: puppet/zulip/files/nagios_plugins/zulip_nagios_server/check_website_respo
Copyright: 2011 Chris Freeman
License: GPL-2.0
Files: puppet/zulip_ops/files/trac/cgi-bin/
Files: puppet/zulip_internal/files/trac/cgi-bin/
Copyright: 2003-2009 Edgewall Software
2003-2004 Jonas Borgström <jonas@edgewall.com>
License: BSD-3-Clause
Files: puppet/zulip_ops/files/zulip-ec2-configure-interfaces
Files: puppet/zulip_internal/files/zulip-ec2-configure-interfaces
Copyright: 2013-2017, Dropbox, Inc., Kandra Labs, Inc., and contributors
License: Expat
Files: scripts/setup/generate-self-signed-cert
Copyright: 2003-2006 Thom May
2006 Fabio M. Di Nitto
2006 Adam Conrad
2006-2008 Tollef Fog Heen
2008-2015 Stefan Fritsch
2018 Kandra Labs, Inc., and contributors
License: BSD-3-Clause
Files: static/audio/zulip.*
Copyright: 2011 Vidsyn
License: CC-0-1.0
@@ -191,10 +182,6 @@ Files: zerver/lib/decorator.py zerver/management/commands/runtornado.py scripts/
Copyright: Django Software Foundation and individual contributors
License: BSD-3-Clause
Files: zerver/lib/json_encoder_for_html.py zerver/tests/test_json_encoder_for_html.py
Copyright: 2006 Bob Ippolito
License: MIT or Academic Free License v. 2.1
License: Apache-2.0
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.

docs/_static/zulip-create-realm.png (new binary image, 38 KiB; file not shown)
(second new binary image, 40 KiB; file not shown)

@@ -6,10 +6,11 @@ Key Codebases
The core Zulip application is at
[<https://github.com/zulip/zulip>](https://github.com/zulip/zulip) and
is a web application written in Python 3.x and using the Django framework. That
codebase includes server-side code and the web client, as well as Python API
bindings and most of our integrations with other services and applications (see
[the directory structure guide](../overview/directory-structure.html)).
is a web application written in Python 2.7 (soon to also support
Python 3) and using the Django framework. That codebase includes
server-side code and the web client, as well as Python API bindings
and most of our integrations with other services and applications (see
[the directory structure guide](directory-structure.html)).
[Zulip Mobile](https://github.com/zulip/zulip-mobile) is the official
mobile Zulip client supporting both iOS and Android, written in
@@ -56,12 +57,12 @@ choose whether to allow anyone to register an account and join, or
only allow people who have been invited, or restrict registrations to
members of particular groups (using email domain names or corporate
single-sign-on login for verification). For more on security
considerations, see [the security model section](../production/security-model.html).
considerations, see [the security model section](security-model.html).
The Zulip "All messages" screen is like a chronologically ordered inbox;
The default Zulip home screen is like a chronologically ordered inbox;
it displays messages, starting at the oldest message that the user
hasn't viewed yet (for more on that logic, see [the guide to the
pointer and unread counts](../subsystems/pointer.html)). The "All messages" screen displays
pointer and unread counts](pointer.html)). The home screen displays
the most recent messages in all the streams a user has joined (except
for the streams they've muted), as well as private messages from other
users, in strict chronological order. A user can *narrow* to view only
@@ -80,7 +81,7 @@ real-time notifications they find irrelevant.
Components
----------
![architecture-simple](../images/architecture_simple.png)
![architecture-simple](images/architecture_simple.png)
### Django and Tornado
@@ -112,7 +113,7 @@ exception to this is that Zulip uses websockets through Tornado to
minimize latency on the code path for **sending** messages.
There is detailed documentation on the
[real-time push and event queue system](../subsystems/events-system.html); most of
[real-time push and event queue system](events-system.html); most of
the code is in `zerver/tornado`.
#### HTML templates, JavaScript, etc.
@@ -125,10 +126,10 @@ live-rendering HTML from JavaScript for things like the main message
feed.
For more details on the frontend, see our documentation on
[translation](../translating/translating.html),
[templates](../subsystems/html-templates.html),
[directory structure](../overview/directory-structure.html), and
[the static asset pipeline](../subsystems/front-end-build-process.html).
[translation](translating.html),
[templates](html-templates.html),
[directory structure](directory-structure.html), and
[the static asset pipeline](front-end-build-process.html).
[Jinja2]: http://jinja.pocoo.org/
[Handlebars]: http://handlebarsjs.com/
@@ -179,7 +180,7 @@ processes that process event queues. We use event queues for the kinds
of tasks that are best run in the background because they are
expensive (in terms of performance) and don't have to be synchronous
--- e.g., sending emails or updating analytics. Also see [the queuing
guide](../subsystems/queuing.html).
guide](queuing.html).
### memcached
@@ -226,7 +227,7 @@ processes started by Supervisor are queue processors that continually
pull things out of a RabbitMQ queue and handle them; they are defined
in `zerver/worker/queue_processors.py`.
Also see [the queuing guide](../subsystems/queuing.html).
Also see [the queuing guide](queuing.html).
### PostgreSQL
@@ -294,8 +295,8 @@ are welcome!
* **star**: Zulip allows a user to mark any message they can see,
public or private, as "starred". A user can easily access messages
they've starred through the "Starred messages" link in the
left sidebar, or use "is:starred" as a narrow or a search
they've starred through the "Starred messages" link in the menu
near "Home", or use "is:starred" as a narrow or a search
constraint. Whether a user has or has not starred a particular
message is private; other users and realm admins don't know
whether a message has been starred, or by whom.


@@ -0,0 +1,119 @@
# Vagrant environment setup (in brief)
Start by cloning this repository: `git clone https://github.com/zulip/zulip.git`
This is the recommended approach for all platforms: it installs the
Zulip development environment inside a VM or container and works on
any platform that supports Vagrant.
The best-performing way to run the Zulip development environment is
using an LXC container on a Linux host, but we support other platforms
such as macOS via VirtualBox (though everything will be 2-3x slower).
* If your host is Ubuntu 15.04 or newer, you can install and configure
the LXC Vagrant provider directly using apt:
```
sudo apt-get install vagrant lxc lxc-templates cgroup-lite redir
vagrant plugin install vagrant-lxc
```
You may want to [configure sudo to be passwordless when using Vagrant LXC][avoiding-sudo].
* If your host is Ubuntu 14.04, you will need to [download a newer
version of Vagrant][vagrant-dl], and then do the following:
```
sudo apt-get install lxc lxc-templates cgroup-lite redir
sudo dpkg -i vagrant*.deb # in directory where you downloaded vagrant
vagrant plugin install vagrant-lxc
```
You may want to [configure sudo to be passwordless when using Vagrant LXC][avoiding-sudo].
* For other Linux hosts with a kernel above 3.12, [follow the Vagrant
LXC installation instructions][vagrant-lxc] to get Vagrant with LXC
for your platform.
* If your host is macOS or older Linux, [download Vagrant][vagrant-dl]
and [VirtualBox][vbox-dl]. Or, instead of Virtualbox you can use
[VMWare Fusion][vmware-fusion-dl] with the [VMWare vagrant
provider][vagrant-vmware-fusion-dl] for a nonfree option with better
performance.
* On Windows: You can use Vagrant and Virtualbox/VMWare on Windows
with Cygwin, similar to the Mac setup. Be sure to create your git
clone using `git clone https://github.com/zulip/zulip.git -c
core.autocrlf=false` to avoid Windows line endings being added to
files (this causes weird errors).
[vagrant-dl]: https://www.vagrantup.com/downloads.html
[vagrant-lxc]: https://github.com/fgrehm/vagrant-lxc
[vbox-dl]: https://www.virtualbox.org/wiki/Downloads
[vmware-fusion-dl]: http://www.vmware.com/products/fusion.html
[vagrant-vmware-fusion-dl]: https://www.vagrantup.com/vmware/
[avoiding-sudo]: https://github.com/fgrehm/vagrant-lxc#avoiding-sudo-passwords
Once that's done, simply change to your zulip directory and run
`vagrant up` in your terminal to install the development server. This
will take a long time on the first run because Vagrant needs to
download the Ubuntu Trusty base image, but later you can run `vagrant
destroy` and then `vagrant up` again to rebuild the environment and it
will be much faster.
Once that finishes, you can run the development server as follows:
```
vagrant ssh
# Now inside the container
/srv/zulip/tools/run-dev.py
```
To get shell access to the virtual machine running the server to run
lint, management commands, etc., use `vagrant ssh`.
At this point you should [read about using the development
environment][using-dev].
[using-dev]: using-dev-environment.html
### Specifying a proxy
If you need to use a proxy server to access the Internet, you will
need to specify the proxy settings before running `vagrant up`.
First, install the Vagrant plugin `vagrant-proxyconf`:
```
vagrant plugin install vagrant-proxyconf
```
Then create `~/.zulip-vagrant-config` and add the following lines to
it (with the appropriate values in it for your proxy):
```
HTTP_PROXY http://proxy_host:port
HTTPS_PROXY http://proxy_host:port
NO_PROXY localhost,127.0.0.1,.example.com
```
Now run `vagrant up` in your terminal to install the development
server. If you ran `vagrant up` before and failed, you'll need to run
`vagrant destroy` first to clean up the failed installation.
You can also change the port on the host machine that Vagrant uses by
adding to your `~/.zulip-vagrant-config` file. E.g. if you set:
```
HOST_PORT 9971
```
(and halt and restart the Vagrant guest), then you would visit
http://localhost:9971/ to connect to your development server.
If you'd like to be able to connect to your development environment from
machines other than the VM host, you can manually set the host IP address
in the `~/.zulip-vagrant-config` file as well. For example, if you set:
```
HOST_IP_ADDR 0.0.0.0
```
(and restart the Vagrant guest), the host IP would be 0.0.0.0, a special value
meaning that any IP address can connect to your development server.


@@ -7,214 +7,6 @@ All notable changes to the Zulip server are documented in this file.
This section lists notable unreleased changes; it is generally updated
in bursts.
### 1.8.0 -- 2018-04-17
**Highlights:**
- Dramatically simplified the server installation process; it's now possible
to install Zulip without first setting up outgoing email.
- Added experimental support for importing an organization's history
from Slack.
- Added a new "night mode" theme for dark environments.
- Added a video call integration powered by Jitsi.
- Lots of visual polish improvements.
- Countless small bugfixes both in the backend and the UI.
**Security and privacy:**
- Several important security fixes since 1.7.0, which were released
already in 1.7.1 and 1.7.2.
- The security model for private streams has changed. Now
organization administrators can remove users, edit descriptions, and
rename private streams they are not subscribed to. See Zulip's
security model documentation for details.
- On Xenial, the local uploads backend now does the same security
checks that the S3 backend did before serving files to users.
Ubuntu Trusty's version of nginx is too old to support this and so
the legacy model is the default; we recommend upgrading.
- Added an organization setting to limit creation of bots.
- Refactored the authentication backends codebase to be much easier to
verify.
- Added a user setting to control whether email notifications include
message content (or just the fact that there are new messages).
**Visual and UI:**
- Added a user setting to translate emoticons/smileys to emoji.
- Added a user setting to choose the emoji set used in Zulip: Google,
Twitter, Apple, or Emoji One.
- Expanded setting for displaying emoji as text to cover all display
settings (previously only affected reactions).
- Overhauled our settings system to eliminate the old "save changes"
button system.
- Redesigned the "uploaded files" UI.
- Redesigned the "account settings" UI.
- Redesigned error pages for the various email confirmation flows.
- Our emoji now display at full resolution on retina displays.
- Improved placement of text when inserting emoji via picker.
- Improved the descriptions and UI for many settings.
- Improved visual design of the help center (/help/).
**Core chat experience:**
- Added support for mentioning groups of users.
- Added a setting to allow users to delete their messages.
- Added support for uploading files in the message-edit UI.
- Redesigned the compose area for private messages to use pretty pills
rather than raw email addresses to display recipients.
- Added new ctrl+B, ctrl+I, ctrl+L compose shortcuts for inserting
common syntax.
- Added warning when linking to a private stream via typeahead.
- Added support for automatically-numbered markdown lists.
- Added a big warning when posting to #announce.
- Added a notification when drafts are saved, to make them more
discoverable.
- Added a fast local echo to emoji reactions.
- Messages containing just a link to an image (or an uploaded image)
now don't clutter the feed with the URL: we just display the image.
- Redesigned the API for emoji reactions to support the full range of
how emoji reactions are used.
- Fixed most of the known (mostly obscure) bugs in how messages are
formatted in Zulip.
- Fixed "more topics" to correctly display all historical topics for
public streams, even those from before a user subscribed.
- Added a menu item to mark all messages as read.
- Fixed image upload file pickers offering non-image files.
- Fixed some subtle bugs with full-text search and unicode.
- Fixed bugs in the "edit history" HTML rendering process.
- Fixed popovers being closed when new messages come in.
- Fixed unexpected code blocks when using the email mirror.
- Fixed clicking on links to a narrow opening a new window.
- Fixed several subtle bugs with the email gateway system.
- Fixed layering issues with mobile Safari.
- Fixed several obscure real-time synchronization bugs.
- Fixed handling of messages with a very large HTML rendering.
- Fixed several bugs around interacting with deactivated users.
- Fixed interaction bugs with unread counts and deleting messages.
- Fixed support for replacing deactivated custom emoji.
- Fixed scrolling downwards in narrows.
- Optimized how user avatar URLs are transmitted over the wire.
- Optimized message sending performance a bit more.
- Fixed a subtle and hard-to-reproduce bug that resulted in every
message being condensed ([More] appearing on every message).
- Improved typeahead's handling of editing an already-completed mention.
- Improved syntax for inline LaTeX to be more convenient.
- Improved syntax for permanent links to streams in Zulip.
- Improved behavior of copy-pasting a large number of messages.
- Improved handling of browser undo in compose.
- Improved saved drafts system to garbage-collect old drafts and sort
by last modification, not creation.
- Removed the legacy "Zulip labs" autoscroll_forever setting. It was
enabled mostly by accident.
- Removed some long-deprecated markdown syntax for mentions.
- Added support for clicking on a mention to see a user's profile.
- Links to logged-in content in Zulip now take the user to the
appropriate upload or view after a user logs in.
- Renamed "Home" to "All messages", to avoid users clicking on it too
early in using Zulip.
- Added a user setting to control whether the organization's name is
included in email subject lines.
- Fixed uploading user avatars encoded using the CMYK mode.
**User accounts and invites:**
- Added support for users in multiple realms having the same email.
- Added a display of whether the user is logged in on logged-out pages.
- Added support for inviting a new user as an administrator.
- Added a new organization settings page for managing invites.
- Added rate-limiting on inviting users to join a realm (prevents spam).
- Added an organization setting to disable welcome emails to new users.
- Added an organization setting to ban disposable email addresses
(i.e., those from sites like mailinator.com).
- Improved the password reset flow to be less confusing if you don't
have an account.
- Split the Notifications Stream setting into two settings, one for new
users, the other for new streams.
**Stream subscriptions and settings:**
- Added traffic statistics (messages/week) to the "Manage streams" UI.
- Fixed numerous issues in the "stream settings" UI.
- Fixed numerous subtle bugs with the stream creation UI.
- Changed the URL scheme for stream narrows to encode the stream ID,
so that they can be robust to streams being renamed. The change is
backwards-compatible; existing narrow URLs still work.
**API, bots, and integrations:**
- Rewrote our API documentation to be much more friendly and
expansive; it now covers most important endpoints, with nice examples.
- New integrations: ErrBot, GoCD, Google Code-In, Opbeat, Groove,
Raygun, Insping, Dialogflow, Dropbox, Front, Intercom,
Statuspage.io, Flock and Beeminder.
- Added support for embedded interactive bots.
- Added inline preview + player for Vimeo videos.
- Added new event types and fixed bugs in several webhook integrations.
- Added support for default bots to receive messages when they're
mentioned, even if they are not subscribed.
- Added support for overriding the topic in all incoming webhook integrations.
- Incoming webhooks now send a private message to the bot owner for
more convenient testing if a stream is not specified.
- Rewrote documentation for many integrations to use a cleaner
numbered-list format.
- APIs for fetching messages now provide more metadata to help clients.
**Keyboard shortcuts:**
- Added new "basics" section to keyboard shortcuts documentation.
- Added a new ">" keyboard shortcut for quote-and-reply.
- Added a new "p" keyboard shortcut to just to next unread PM thread.
- Fixed several hotkey scope bugs.
- Changed the hotkey for compose-private-message from "C" to "x".
- Improved keyboard navigation of the left and right sidebars with arrow keys.
**Mobile apps backend:**
- Added support for logging into the mobile apps with RemoteUserBackend.
- Improved mobile notifications to support narrowing when one clicks a
mobile push notification.
- Statistics on the fraction of strings that are translated now
include strings in the mobile apps as well.
**For server admins:**
- Added certbot support to the installer for getting certificates.
- Added support for hosting multiple domains, not all as subdomains of
the same base domain.
- Added a new nagios check for the Zulip analytics state.
- Fixed buggy APNs logic that could cause extra exception emails.
- Fixed a missing dependency for the localhost_sso auth backend.
- Fixed subtle bugs in garbage-collection of old node_modules versions.
- Clarified instructions for server settings (especially LDAP auth).
- Added missing information on the requesting user in many exception emails.
- Improved Tornado retry logic for connecting to RabbitMQ.
- Added a server setting to control whether digest emails are sent.
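Retry logic of the kind mentioned for the RabbitMQ connection is commonly implemented as exponential backoff. A generic sketch, not Zulip's actual Tornado code; `connect_with_backoff` is an invented name:

```python
import time

def connect_with_backoff(connect, attempts=5, base_delay=1.0, max_delay=30.0):
    """Retry `connect` (any callable that raises on failure) with
    exponentially growing delays, capped at `max_delay` seconds.
    A hypothetical illustration of the retry pattern."""
    delay = base_delay
    for attempt in range(attempts):
        try:
            return connect()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts; let the caller see the error
            time.sleep(delay)
            delay = min(delay * 2, max_delay)
```

Capping the delay keeps reconnection responsive during a long broker outage while still avoiding a tight retry loop.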
**For Zulip developers:**
- Migrated the codebase to use the nice Python 3 typing syntax.
- Added a new /team/ page explaining the team, with a nice
visualization of our contributors.
- Dramatically improved organization of developer docs.
- Backend test coverage is now 95%.
### 1.7.2 -- 2018-04-12
This is a security release, with a handful of cherry-picked changes
since 1.7.1. All Zulip server admins are encouraged to upgrade
promptly.
- CVE-2018-9986: Fix XSS issues with frontend markdown processor.
- CVE-2018-9987: Fix XSS issue with muting notifications.
- CVE-2018-9990: Fix XSS issue with stream names in topic typeahead.
- CVE-2018-9999: Fix XSS issue with user uploads. The fix for this
adds a Content-Security-Policy for the `LOCAL_UPLOADS_DIR` storage
backend for user-uploaded files.
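The Content-Security-Policy mitigation works by telling browsers not to execute scripts in user-uploaded files. A minimal sketch; the function name and the header value here are assumptions for illustration, not Zulip's exact policy:

```python
def add_upload_csp(headers):
    # Hypothetical sketch: lock down responses serving user uploads so
    # an uploaded HTML/SVG file cannot run scripts in the site's origin.
    # The exact directive list is an assumption for illustration.
    headers["Content-Security-Policy"] = (
        "default-src 'none'; img-src 'self'; sandbox"
    )
    return headers
```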
Thanks to Suhas Sunil Gaikwad for reporting CVE-2018-9987 and w2w for
reporting CVE-2018-9986 and CVE-2018-9990.
### 1.7.1 -- 2017-11-21
This is a security release, with a handful of cherry-picked changes
@@ -290,9 +82,9 @@ Backend and scaling
minimizes disruption by running these first, before beginning the
user-facing downtime. However, if you'd like to watch the downtime
phase of the upgrade closely, we recommend
[running them first manually](../production/expensive-migrations.html) and as well
[running them first manually](expensive-migrations.html) and as well
as the usual trick of
[doing an apt upgrade first](../production/maintain-secure-upgrade.html#applying-ubuntu-system-updates).
[doing an apt upgrade first](prod-maintain-secure-upgrade.html#applying-ubuntu-system-updates).
* We've removed support for an uncommon legacy deployment model where
a Zulip server served multiple organizations on the same domain.
@@ -302,7 +94,7 @@ Backend and scaling
This change should have no effect for the vast majority of Zulip
servers that only have one organization. If you manage a server
that hosts multiple organizations, you'll want to read [our guide on
multiple organizations](../production/multiple-organizations.html).
multiple organizations](prod-multiple-organizations.html).
* We simplified the configuration for our password strength checker to
be much more intuitive. If you were using the
@@ -450,7 +242,7 @@ Zulip apps.
Hungarian, Polish, Dutch, Russian, Bulgarian, Portuguese,
Serbian, Malayalam, Korean, and Italian).
[mobile-push]: ../production/mobile-push-notifications.html
[mobile-push]: https://zulip.readthedocs.io/en/latest/prod-mobile-push-notifications.html
[electron-app]: https://github.com/zulip/zulip-electron/releases
[ios-app]: https://itunes.apple.com/us/app/zulip/id1203036395


@@ -1,20 +1,32 @@
# The chat.zulip.org community
[https://chat.zulip.org](https://chat.zulip.org/) is the primary communication
forum for the Zulip community.
[chat.zulip.org](https://chat.zulip.org/) is the primary communication
forum for the Zulip community. It is a Zulip server that you can
connect to from any modern web browser.
You can go through the simple signup process at that link, and then
you will soon be talking to core Zulip developers and other users. To
get help in real time, you will have the best luck finding core
developers roughly between 17:00 UTC and 6:00 UTC, but the sun never
sets on the Zulip community. Most questions get a reply within
developers roughly between 17:00 UTC and 2:00 UTC or during [office
hours and sprints](#office-hours-and-sprints), but the sun never
sleeps on the Zulip community. Most questions get a reply within
minutes to a few hours, depending on the time of day.
## Community norms
## This is a bleeding edge development server
* Send test messages to
The chat.zulip.org server is frequently deployed off of `master` from
the Zulip Git repository, so please point out anything you notice that
seems wrong! We catch many bugs that escape code review this way.
The chat.zulip.org server is a development and testing server, not a
production service, so don't use it for anything mission-critical,
secret/embarrassing, etc.
## Community conventions
* Send any test messages to
[#test here](https://chat.zulip.org/#narrow/stream/test.20here) or
as a PM to yourself to avoid disturbing others.
as a PM to yourself to avoid disrupting others.
* When asking for help, provide the details needed for others to help
you. E.g. include the full traceback in a code block (not a
screenshot), a link to the code or a WIP PR you're having trouble
@@ -28,11 +40,8 @@ minutes to a few hours, depending on the time of day.
Mentioning other users is great for timely questions or making sure
someone who is not online sees your message.
* Converse informally; there's no need to use titles like "Sir" or "Madam".
* Use
[gender-neutral language](https://en.wikipedia.org/wiki/Gender-neutral_language).
For example, avoid using a pronoun like her or his in sentences like
"Every developer should clean [their] keyboard at least once a week."
* Follow the community [code of conduct](../code-of-conduct.html).
* Use gender-neutral language.
* Follow the [community code of conduct](code-of-conduct.html).
* Participate! Zulip is a friendly and welcoming community, and we
love meeting new people, hearing about what brought them to Zulip,
and getting their feedback. If you're not sure where to start,
@@ -52,16 +61,6 @@ To make the best use of your time, we highly recommend that you
unsubscribe from streams that you aren't interested in, and mute
streams that are only of occasional interest.
## This is a bleeding edge development server
The chat.zulip.org server is frequently deployed off of `master` from
the Zulip Git repository, so please point out anything you notice that
seems wrong! We catch many bugs that escape code review this way.
The chat.zulip.org server is a development and testing server, not a
production service, so don't use it for anything mission-critical,
secret/embarrassing, etc.
## Streams
There are a few streams worth highlighting that are relevant for
@@ -73,7 +72,7 @@ everyone, even non-developers:
* [#feedback](https://chat.zulip.org/#narrow/stream/feedback) is for
posting feedback on Zulip.
* [#design](https://chat.zulip.org/#narrow/stream/design) is where we
discuss UI and feature design and collect feedback on potential design
discuss the UI design and collect feedback on potential design
changes. We love feedback, so don't hesitate to speak up!
* [#user community](https://chat.zulip.org/#narrow/stream/user.20community) is
for Zulip users to discuss their experiences using and adopting Zulip.
@@ -124,16 +123,45 @@ meetings, and they're a great time to stop by and introduce yourself
if you'd like to get involved (though really, any time is, so).
Here are the regular meetings that exist today along with their usual
times:
times (actual times are listed in the linked agenda documents):
* Mobile team on
[#mobile](https://chat.zulip.org/#narrow/stream/mobile), generally
Wednesdays at 10AM Pacific time.
Wednesdays at 10AM Pacific time. [Agendas][mobile-agendas].
* Backend/infrastructure team on
[#backend](https://chat.zulip.org/#narrow/stream/backend), generally
Fridays at 10AM Pacific time.
Fridays at 10AM Pacific time. [Agendas][infra-agendas].
* Bots and integrations team on
[#integrations](https://chat.zulip.org/#narrow/stream/integrations),
generally Fridays at 9AM Pacific time.
generally Fridays at 9AM Pacific time. [Agendas][bots-agendas].
[mobile-agendas]: https://paper.dropbox.com/doc/Zulip-mobile-agendas-nVdb9I7SDiom9hY8Zw8Ge
[infra-agendas]: https://paper.dropbox.com/doc/Zulip-infrastructure-team-agendas-kGyCvF2u2kLcZ1Hzyd9iD
[bots-agendas]: https://paper.dropbox.com/doc/Zulip-bots-and-integrations-agendas-3MR8NAL3fg4tIEpfb5jyx
### Office hours and sprints
We also do project-wide ad-hoc "office hours" and remote sprints
irregularly, about once a month.
Anyone can schedule one: announce it in
[#announce](https://chat.zulip.org/#narrow/stream/announce) and on
[the zulip-devel mailing list](https://groups.google.com/forum/#!forum/zulip-devel)
a few days ahead of time, and ideally, tell
[Sumana](https://chat.zulip.org/#narrow/sender/18-sh) so she can put
it on [the public Zulip meetings calendar][meetings-calendar].
*Office hours* are simply times for us to informally discuss current
global project priorities, find out what questions people have, and so
on. We set them up so people know there'll be more people around at a
particular time to chat. You don't need to RSVP and you don't need to
show up on time or stop conversations when the "hour" stops. They
start in [#general](https://chat.zulip.org/#narrow/stream/general) and
conversations move into other streams and topics as they come up.
*Sprints* are times when Zulip developers get together in chat, and
sometimes in person, to work on related issues at the same time.
[meetings-calendar]: https://calendar.google.com/calendar/embed?src=ktiduof4eoh47lmgcl2qunnc0o@group.calendar.google.com


@@ -72,7 +72,7 @@ group. However, "讨论组" has one more Chinese character than "频道
* Invite-Only/Public Stream - **私有/公开频道**
"Invite-Only Stream" requires users must be invited explicitly to
subscribe, which assures a high privacy. Other users cannot perceive
subscribe, which assures a high privacy. Other users can not perceive
the presence of such streams. Since literal translation is hard to
read, it is translated sense to sense as "私有频道(Private Stream)"。


@@ -35,4 +35,4 @@ object as `request.client`.
In most integrations, `request.client` is then passed to
`check_send_stream_message`, where it is used to keep track of which client
sent the message (which in turn is used by analytics). For more
information, see [the webhook walkthrough](https://zulipchat.com/api/webhook-walkthrough).
information, see [the webhook walkthrough](webhook-walkthrough.html).


@@ -103,7 +103,7 @@ this?". Good choices include
change being made. Tests that exclude whole classes of potential
bugs are preferred when possible (e.g., the common test suite
`test_bugdown.py` between the Zulip server's [frontend and backend
Markdown processors](../subsystems/markdown.html), or the `GetEventsTest` test for
Markdown processors](markdown.html), or the `GetEventsTest` test for
buggy race condition handling).
* *Translation.* Make sure that the strings are marked for
@@ -192,11 +192,11 @@ We also strongly recommend reviewers to go through the following resources.
* [Code Review - A consolidation of advice and stuff from the
internet](https://gist.github.com/porterjamesj/002fb27dd70df003646df46f15e898de)
article by James J. Porter
* [Zulip Code of Conduct](../code-of-conduct.html)
* [Zulip Code of Conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html)
[code-style]: ../contributing/code-style.html
[commit-messages]: ../contributing/version-control.html#commit-messages
[test-writing]: ../testing/testing.html
[mypy]: ../contributing/mypy.html
[git tool]: ../git/zulip-tools.html#fetch-a-pull-request-and-rebase
[translation]: ../translating/translating.html
[code-style]: code-style.html
[commit-messages]: version-control.html#commit-messages
[test-writing]: testing.html
[mypy]: mypy.html
[git tool]: git-guide.html#fetch-a-pull-request-and-rebase
[translation]: translating.html


@@ -163,7 +163,7 @@ Don't use it:
### Translation tags
Remember to
[tag all user-facing strings for translation](../translating/translating.html), whether
[tag all user-facing strings for translation](translating.html), whether
they are in HTML templates or JavaScript editing the HTML (e.g. error
messages).
@@ -217,7 +217,7 @@ code a lot uglier, in which case it's fine to go up to 120 or so.
When calling a function with an anonymous function as an argument, use
this style:
my_function('foo', function (data) {
$.get('foo', function (data) {
var x = ...;
// ...
});
@@ -292,5 +292,5 @@ All significant new features should come with tests. See testing.
### Third party code
See [our docs on dependencies](../subsystems/dependencies.html) for discussion of
See [our docs on dependencies](dependencies.html) for discussion of
rules about integrating third-party projects.


@@ -15,8 +15,8 @@
import sys
import os
import shlex
from typing import Any, Dict, List, Optional
if False:
from typing import Any, Dict, List, Optional
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
@@ -43,23 +43,18 @@ templates_path = ['_templates']
master_doc = 'index'
# General information about the project.
project = 'Zulip'
copyright = '2015-2018, The Zulip Team'
author = 'The Zulip Team'
project = u'Zulip'
copyright = u'2015-2017, The Zulip Team'
author = u'The Zulip Team'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '1.8'
version = '1.7'
# The full version, including alpha/beta/rc tags.
release = '1.8.0'
# This allows us to insert a warning that appears only on an unreleased
# version, e.g. to say that something is likely to have changed.
if release.endswith('+git'):
tags.add('unreleased')
release = '1.7.1'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@@ -118,18 +113,11 @@ if not on_rtd:
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
else:
html_theme = 'sphinx_rtd_theme'
html_style = None
html_theme_options = {'collapse_navigation': False}
using_rtd_theme = True
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
html_theme_options = {
'collapse_navigation': False,
}
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
@@ -218,6 +206,12 @@ html_static_path = ['_static']
# Output file base name for HTML help builder.
htmlhelp_basename = 'zulip-contributor-docsdoc'
def setup(app):
# type: (Any) -> None
# overrides for wide tables in RTD theme
app.add_stylesheet('theme_overrides.css') # path relative to _static
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
@@ -238,8 +232,8 @@ latex_elements = {
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'zulip-contributor-docs.tex', 'Zulip Documentation',
'The Zulip Team', 'manual'),
(master_doc, 'zulip-contributor-docs.tex', u'Zulip Documentation',
u'The Zulip Team', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
@@ -268,7 +262,7 @@ latex_documents = [
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'zulip-contributor-docs', 'Zulip Documentation',
(master_doc, 'zulip-contributor-docs', u'Zulip Documentation',
[author], 1)
]
@@ -282,7 +276,7 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'zulip-contributor-docs', 'Zulip Documentation',
(master_doc, 'zulip-contributor-docs', u'Zulip Documentation',
author, 'zulip-contributor-docs', 'Documentation for contributing to Zulip.',
'Miscellaneous'),
]
@@ -300,7 +294,6 @@ texinfo_documents = [
#texinfo_no_detailmenu = False
from recommonmark.parser import CommonMarkParser
from recommonmark.transform import AutoStructify
source_parsers = {
'.md': CommonMarkParser,
@@ -309,24 +302,3 @@ source_parsers = {
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
source_suffix = ['.rst', '.md']
def setup(app: Any) -> None:
app.add_config_value('recommonmark_config', {
'enable_eval_rst': True,
# Turn off recommonmark features we aren't using.
'enable_auto_doc_ref': False,
'auto_toc_tree_section': None,
'enable_auto_toc_tree': False,
'enable_math': False,
'enable_inline_math': False,
'url_resolver': lambda x: x,
}, True)
# Enable `eval_rst`, and any other features enabled in recommonmark_config.
# Docs: http://recommonmark.readthedocs.io/en/latest/auto_structify.html
# (But NB those docs are for master, not latest release.)
app.add_transform(AutoStructify)
# overrides for wide tables in RTD theme
app.add_stylesheet('theme_overrides.css') # path relative to _static


@@ -1,17 +0,0 @@
#######################
Code Contribution Guide
#######################
.. toctree::
:maxdepth: 3
version-control
code-style
mypy
code-reviewing
chat-zulip-org
zulipbot-usage
accessibility
bug-reports
../code-of-conduct
summer-with-zulip


@@ -1,285 +0,0 @@
# How to have an amazing summer with Zulip
The purpose of this doc is to provide advice to GSoC/ZSoC mentors and students
on how to make the summer as successful as possible. It's mandatory reading, in
addition to [Google's
materials](https://developers.google.com/open-source/gsoc/resources/manual).
- Don't focus too much on doing precisely what's in the project proposal or
following precisely that schedule. The goals are for students to learn and to
advance Zulip, not to do in July what we guessed would be the right plan in
March with limited information.
- We probably will want to create a Dropbox Paper document for each student to
keep track of the current version of their project plan, but make sure to
keep GitHub up to date with what issues you're working on.
- Claim issues using zulipbot only when you actually start work on them. And
if someone else fixes an issue you were planning to fix, don't worry about
it! It's great for Zulip that the project was finished, and there's plenty
of issues to work on :D. You can help review their work to build
your expertise in the subsystem you're working on.
- Look for, claim, and fix bugs to help keep Zulip polished. Bugs and polish
are usually more important to users than new features.
- Help test new features! It's fun, and one of the most valuable
ways one can contribute to any software project is finding bugs in
it before they reach a lot of users :).
- Participate and be helpful in the community! Helping a new Zulip server
administrator debug their installation problem or playing with the mobile
app until you can get something to break are great ways to contribute.
- Mentors and students should stay in close contact, both with each other and
the rest of the Zulip community. We recommend the following:
- Daily checkins on #checkins on chat.zulip.org; ideally at some time of day
you can both be online, but when not possible, async is better than nothing!
- We prefer checkins in public streams, since it makes it easier for
other contributors to keep track of what everyone else is
working on and share ideas (and helps organization leadership
keep track of progress). Though, of course, feel free to have
much more involved/detailed discussions privately as well.
- If a mentor will be traveling or otherwise offline, mentors should make
sure another mentor is paying attention in the meantime.
- Video calls are great! Mentors should do 1-2 video calls with their
students per week, depending on length, schedules, and what's happening.
- Make sure to talk about not just the current project, but also meta-issues
like your development process, where things are getting stuck, skills you
need help learning, and time-saving tricks.
- If you need feedback from the community / decisions made, ask in the
appropriate public stream on [chat.zulip.org](http://chat.zulip.org). Often
someone can provide important context that you need to succeed in your
project.
- Communicate clearly, especially in public places! You'll get much more
useful feedback to a well-written Zulip message or GitHub issue comment than
one that is unclear.
- Be sure to mention any concerns you have with your own work!
- Talk with your mentor about the status of your various projects and where
they're stuck.
- And when you update your PR having addressed a set of review feedback, be
clear about which issues you've resolved (and how!) and
especially any that you haven't yet (this helps code reviewers
use their time well).
- Post screenshots and/or brief videos of UI changes; a picture can be worth
1000 words, especially for verifying whether a design change is
working as intended.
- Use #design and similar forums to get feedback on issues where we need
community consensus on what something should look like or how it
should work.
- Bring up problems early, whether technical or otherwise. If you
find you're stressed about something, mention it to your mentor
immediately, so they can help you solve the problem. If you're
stressed about something involving your mentor, bring it up with
an organization admin.
- Join Zulip's GitHub teams that relate to your projects and/or interests, so
that you see new issues and PRs coming in that are relevant to your work.
You can browse the area teams here:
https://github.com/orgs/zulip/teams (You need to be a member of
the Zulip organization to see them; ask Tim for an invite if needed).
- Everyone's goal is to avoid students ending up blocked and feeling stuck.
There are lots of things that students can do (and mentors can help them to)
to avoid this:
- Get really good at using `git rebase -i` to produce a really clean
commit history that's fast to review. We occasionally do workshops
on how to do relatively complex rebases.
- Work on multiple parallelizable projects (or parts of projects) at a time.
This can help avoid being stuck while waiting for something to be reviewed.
- It can help to plan a bit in advance; if your next project requires some
UX decisions to be made with the community, start the conversation a few
days before you need an answer. Or do some preparatory refactoring that
will make the feature easier to complete and can be merged without making
all the decisions.
- Think about how to test your changes.
- Among your various projects, prioritize as follows:
- (1) Fixing regressions you introduced with recently merged work (and other
bugs you notice).
- (2) Responding to code review feedback and fixing your in-flight branches
over starting new work. Unmerged PRs develop painful merge conflicts
pretty quickly, so you'll do much less total work per feature if you're
responsive and try to make it easy for maintainers to merge your commits.
- (3) Doing any relevant follow-ups to larger projects you've completed, to
make sure that you've left things better than how you found them.
- (4) Starting on the next project.
- Figure out a QA/testing process that works for you, and be sure to explain
in your PRs how you've tested your changes. Most of the time, in a large
open source project, is spent looking for and fixing regressions, and it
saves everyone time when bugs can be fixed before the code is reviewed, or
barring that, before it's merged.
- Plan (and when planning fails, rebase) your branches until they are easy
to merge partially (i.e. merging just the first commit will not make Zulip
worse or break the tests). Ideally, when reviewing a branch of yours, the
maintainer should be able to merge the first few commits and leave comments
on the rest. This is by far the most efficient way to do collaborative
development, since one is constantly making progress, we keep branches
small, and developers don't end up reviewing the easily merged parts of a PR
repeatedly.
- Look at Steve Howell's closed PRs to get a feel for how to do this well
for even complex changes.
- Or Eklavya Sharma's (from GSoC 2016) to see a fellow GSoC student doing
this well. (`git log -p --author=Eklavya` is a fast way to skim.)
- Team up with other developers close to or in your time zone who are working
on similar areas to trade timely initial code reviews. 75% of the feedback
that the expert maintainers give is bugs/UI problems from clicking around,
lack of tests, or code clarity issues that anyone else in the project should
be able to point out. Doing this well can save a lot of round-trips.
- Help with code review! Reviewing others' changes is one of the best ways to
learn to be a better developer, since you'll both see how others solve
problems and also practice the art of catching bugs in unfamiliar code.
- It's best to start with areas where you know the surrounding code
and expertise, but don't be afraid to open up the code in your
development environment and read it rather than trying to
understand everything from the context GitHub will give you. Even
Tim reads surrounding code much of the time when reviewing things,
and so should you :).
- It's OK to review something that's already been reviewed or just post a
comment on one thing you noticed in a quick look!
- Even posting a comment that you tried a PR and it worked in your development
environment is valuable; you'll save the next reviewer a bit of time
verifying that.
- If you're confused by some code, usually that's because the code is
confusing, not because you're not smart enough. So speak up when you notice
this! Very frequently, this is a sign that we need to write more
docs/comments or (better, if possible!) to make the code more
self-explanatory.
- Plan your approach to larger projects. Usually, when tackling something big,
there's a few phases you want to go through:
- Studying the subsystem, reading its docs, etc., to get a feel for how things
work. Often a good approach is to fix some small bugs in the area to warm
your knowledge up.
- Figure out how you'll test your feature, both manually and via
automated tests. For some projects, you can save a lot of hours by doing a bit
of pre-work on test infrastructure or `populate_db` initial data
to make it easy for both you and code reviewers to get the state
necessary to test a feature.
- Make a plan for how to create a series of small (<100LOC) commits that are
each safely mergable and move you towards your goal. Often this ends up
happening through first making a hacky attempt at hooking together the
feature, using reading and print statements as part of the effort, to
identify any refactoring needed or tests you want to write to help make sure
your changes won't break anything important as you work. Work out a fast and
consistent test procedure for how to make sure the feature is working as
planned.
- Do the prerequisite test/refactoring/etc. work, and get those changes
merged.
- Build a mergeable version of the feature on top of those refactorings.
Whenever possible, find chunks of complexity that you can separate from the
rest of the project.
- Spend time every week thinking about what could make contributing to Zulip
easier for both yourself and the next generation of Zulip developers. And then
make those ideas reality!
- Have fun! Spending your summer coding on open source is an amazing life
opportunity, and we hope you'll have a blast. With some luck and hard work,
your contributions to the open source world this summer will be something you
can be proud of for the rest of your life.
## What makes a successful summer
Success for the student means a few things, in order of importance:
- Mastery of the skills needed to be a self-sufficient and effective open source
developer. Ideally, by the end of the summer, most of the student's PRs should
go through only a couple rounds of code review before being merged, both in
Zulip and in any future open source projects they choose to join.
Our most successful students end up as the maintainer for one or
more areas within Zulip.
- The student has become a valued member of the Zulip community, and has made
the Zulip community a better place through their efforts. Reviewing PRs,
helping others debug, providing feedback, and finding bugs are all essential
ways to contribute beyond the code in your own project.
- Zulip becoming significantly better in the areas the student focused on. The
area should feel more polished, and have several new major features the
student has implemented. That section of code should be more readable,
better-tested, and have clearer documentation.
## Extra notes for mentors
- You're personally accountable for your student having a successful summer. If
you get swamped and find you don't have enough time, tell the org admins so
that we can make sure someone is covering for you. Yes, it sucks when you
can't do what you signed up for, but even worse is to not tell anyone and thus
prevent the project from finding a replacement.
- Mentors are expected to post a **brief weekly report** on the mentors
stream covering (1) how your students' projects are going, (2) what (if anything)
you're worried about, and (3) what new things you'd like to try this week to
help your student. A great time to do this is after a weekly scheduled call
with your student, while your recollection of the state is fresh.
- Timely feedback is more important than complete feedback, so get a fast
feedback cadence going with your student. It's amazing how useful just 5
minutes of feedback can be. Pay attention to the relative timezones; if you
plan it, you can get several round trips in per day even with big timezone
differences like USA + India.
- What exactly you focus on in your mentorship will vary from week to week and
depend somewhat on what the student needs. It might be any combination of
these things:
- Helping the student plan, chunk, and prioritize their work.
- Manually testing UI changes and helping find bugs.
- Doing code review of your student's work.
- Providing early feedback on visual and technical design questions.
- Helping the student figure out how to test their changes.
- Helping the student break their PRs into reviewing chunks.
- Making sure busy maintainers like Tim Abbott provide any necessary feedback
so that the student's project doesn't get stuck.
- Helping with the technical design of projects and making sure they're aware
of useful and relevant reference materials.
- Pair programming with the student to share useful tricks and techniques.
- Emotional support when things feel like they aren't going well.


@@ -33,7 +33,7 @@ document:
This document focuses almost entirely on the **export** piece. Issues
with getting Zulip itself running are out of scope here; see [the
-production installation instructions](../index.html#zulip-in-production).
+production installation instructions](index.html#zulip-in-production).
As for the import side of things, we only touch on it implicitly. (My
reasoning was that we *had* to get the export piece right in a timely
fashion, even if it meant we would have to sort out some straggling
@@ -116,9 +116,9 @@ process the data, which isn't surprising for a top-down approach.)
The next section of the document talks about risk factors.
-## Risk Mitigation
+# Risk Mitigation
-### Generic considerations
+## Generic considerations
We have two major mechanisms for getting data:
@@ -144,9 +144,9 @@ duplicating some work, particularly on the message side of things.
We have not yet integrated the approved-transfer model, which tells us
which users can be moved.
-### Risk factors broken out by data categories
+## Risk factors broken out by data categories
-#### Message Data
+### Message Data
- models: `Message`/`UserMessage`.
- assets: `messages-*.json`, subprocesses, partial files
@@ -165,7 +165,7 @@ We currently have these measures in place for top-down processing:
- messages are filtered by both sender and recipient
-#### File Related Data
+### File Related Data
- models: `Attachment`
- assets: S3, `attachment.json`, `uploads-temp/`, image files in
@@ -185,7 +185,7 @@ parts**:
- At import time we have to populate the `m2m` table (but fortunately,
this is pretty low risk in terms of breaking anything.)
-#### Recipient Data
+### Recipient Data
- models: `Recipient/Stream/Subscription/Huddle`
- assets: `realm.json`, `(user,stream,huddle)_(recipient,subscription)`
@@ -219,7 +219,7 @@ Recommendation: We probably want to get a backup of all this data that
is very simply bulk-exported from the entire DB, and we should
obviously put it in a secure place.
-#### Cross Realm Data
+### Cross Realm Data
- models: `Client`
- assets: `realm.json`, three bots (`notification`/`email`/`welcome`),
`id_maps`
@@ -245,7 +245,7 @@ example. As for possibly missing messages that the welcome bot and
friends have sent in the past, I am not sure what our risk profile is
there, but I imagine it is relatively low.
-#### Disjoint User Data
+### Disjoint User Data
- models: `UserProfile/UserActivity/UserActivityInterval/UserPresence`
- assets: `realm.json`, `password`, `api_key`, `avatar salt`,
`id_maps`
@@ -259,7 +259,7 @@ We have code in place to exclude `password` and `api_key` from
`UserProfile` rows. The import process calls
`set_unusable_password()`.
-#### Public Realm Data
+### Public Realm Data
- models: `Realm/RealmDomain/RealmEmoji/RealmFilter/DefaultStream`
- assets: `realm.json`
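
The hunk above notes that `password` and `api_key` are excluded from exported `UserProfile` rows, with `set_unusable_password()` called at import time. A minimal sketch of that scrubbing step — the function and field names here are illustrative, not Zulip's actual export code:

```python
# Illustrative sketch; Zulip's real export logic lives elsewhere in the codebase.
SENSITIVE_FIELDS = {"password", "api_key"}

def scrub_user_row(row: dict) -> dict:
    """Return a copy of an exported user row without credential fields.

    Dropping the password hash is safe because the import side calls
    set_unusable_password(), forcing every imported user to reset it.
    """
    return {key: value for key, value in row.items() if key not in SENSITIVE_FIELDS}

exported = {"email": "iago@zulip.com", "password": "pbkdf2_sha256$stub", "api_key": "stubkey"}
scrubbed = scrub_user_row(exported)
```

Keeping the exclusion list in one place makes it easy to audit which credential-bearing fields never leave the database during an export.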


@@ -177,7 +177,7 @@ you basically have to solve these problems:
Zulip actually supports a bunch of integrations out-of-the-box that
perform as **World Readers**.
-The [three different integration models](https://zulipchat.com/api/integration-guide#types-of-integrations)
+The [three different integration models](integration-guide.html#types-of-integrations)
basically differ in where they perform the main functions of a
**World Reader**.


@@ -74,12 +74,12 @@ the backend, but does in JavaScript.
For the third-party services like Postgres, Redis, Nginx, and RabbitMQ
that are documented in the
-[architecture overview](../overview/architecture-overview.html), we rely on the
+[architecture overview](architecture-overview.html), we rely on the
versions of those packages provided alongside the Linux distribution
on which Zulip is deployed. Because Zulip
-[only supports Ubuntu in production](../production/requirements.html), this
+[only supports Ubuntu in production](prod-requirements.html), this
usually means `apt`, though we do support
-[other platforms in development](../development/setup-advanced.html). Since
+[other platforms in development](dev-setup-non-vagrant.html). Since
we don't control the versions of these dependencies, we avoid relying
on specific versions of these packages wherever possible.
@@ -117,8 +117,8 @@ highlighting. The system is largely managed by the code in
versions in a `requirements.txt` file to declare what we're using.
Since we have a few different installation targets, we maintain
several `requirements.txt` format files in the `requirements/`
-directory (e.g. `dev.in` for development, `prod.in` for
-production, `docs.in` for ReadTheDocs, `common.in` for the vast
+directory (e.g. `dev.txt` for development, `prod.txt` for
+production, `docs.txt` for ReadTheDocs, `common.txt` for the vast
majority of packages common to prod and development, etc.). We use
`pip install --no-deps` to ensure we only install the packages we
explicitly declare as dependencies.
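
Since the hunk above describes installing only explicitly pinned packages with `pip install --no-deps`, here is a rough sketch of how one might verify an environment against such exact pins; the helper name and lock-file lines are illustrative, not Zulip's actual tooling:

```python
from importlib import metadata

def unsatisfied_pins(lock_lines):
    """Yield (name, pinned, installed) for each exact pin the environment misses."""
    for line in lock_lines:
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if not line or "==" not in line:
            continue
        name, pinned = (part.strip() for part in line.split("==", 1))
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None  # not installed at all
        if installed != pinned:
            yield (name, pinned, installed)

# A package that isn't installed shows up as an unsatisfied pin.
mismatches = list(unsatisfied_pins(["no-such-package-xyz==1.0"]))
```

A check like this is what makes fully pinned requirements files useful: any drift between the lock file and the environment is detectable mechanically.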
@@ -133,23 +133,6 @@ highlighting. The system is largely managed by the code in
effect is that it's easy to debug problems caused by dependency
upgrades, since we're always doing those upgrades with an explicit
commit updating the `requirements/` directory.
-* **Pinning versions of indirect dependencies**. We "pin" or "lock"
-the versions of our indirect dependencies files with
-`tools/update-locked-requirements` (powered by `pip-compile`). What
-this means is that we have some "source" requirements files, like
-`requirements/common.in`, that declare the packages that Zulip
-depends on directly. Those packages have their own recursive
-dependencies. When adding or removing a dependency from Zulip, one
-simply edits the appropriate "source" requirements files, and then
-runs `tools/update-locked-requirements`. That tool will use `pip
-compile` to generate the locked requirements files like `prod.txt`,
-`dev.txt` etc files that explicitly declare versions of all of
-Zulip's recursive dependencies. For indirect dependencies (i.e.
-dependencies not explicitly declared in the source requirements files),
-it provides helpful comments explaining which direct dependency (or
-dependencies) needed that indirect dependency. The process for
-using this system is documented in more detail in
-`requirements/README.md`.
* **Caching of virtualenvs and packages**. To make updating the
dependencies of a Zulip installation efficient, we maintain a cache
of virtualenvs named by the hash of the relevant `requirements.txt`
@@ -170,6 +153,23 @@ highlighting. The system is largely managed by the code in
production deployment directory under `/home/zulip/deployments/`.
This helps ensure that a Zulip installation doesn't leak large
amounts of disk over time.
+* **Pinning versions of indirect dependencies**. We "pin" or "lock"
+the versions of our indirect dependencies files with
+`tools/update-locked-requirements` (powered by `pip-compile`). What
+this means is that we have some "source" requirements files, like
+`requirements/common.txt`, that declare the packages that Zulip
+depends on directly. Those packages have their own recursive
+dependencies. When adding or removing a dependency from Zulip, one
+simply edits the appropriate "source" requirements files, and then
+runs `tools/update-locked-requirements`. That tool will use `pip
+compile` to generate the `prod_lock.txt` and `dev_lock.txt` files
+that explicitly declare versions of all of Zulip's recursive
+dependencies. For indirect dependencies (i.e. dependencies not
+explicitly declared in the source requirements files), it provides
+helpful comments explaining which direct dependency (or
+dependencies) needed that indirect dependency. The process for
+using this system is documented in more detail in
+`requirements/README.md`.
* **Scripts**. Often, we want a script running in production to use
the Zulip virtualenv. To make that work without a lot of duplicated
code, we have a helpful library,
@@ -197,7 +197,7 @@ reasoning here.
dependencies in the `yarn.lock` file; `yarn upgrade` updates the
`yarn.lock` files.
* `tools/update-prod-static`. This process is discussed in detail in
-the [static asset pipeline](../subsystems/front-end-build-process.html) article,
+the [static asset pipeline](front-end-build-process.html) article,
but we don't use the `node_modules` directories directly in
production. Instead, static assets are compiled using our static
asset pipeline and it is the compiled assets that are served
@@ -241,7 +241,7 @@ Zulip uses the [iamcal emoji data package][iamcal] for its emoji data
and sprite sheets. We download this dependency using `npm`, and then
have a tool, `tools/setup/build_emoji`, which reformats the emoji data
into the files under `static/generated/emoji`. Those files are in
-turn used by our [markdown processor](../subsystems/markdown.html) and
+turn used by our [markdown processor](markdown.html) and
`tools/update-prod-static` to make Zulip's emoji work in the various
environments where they need to be displayed.
@@ -256,7 +256,7 @@ files and a few large ones. There is a more extended article on our
### Translations data
-Zulip's [translations infrastructure](../translating/translating.html) generates
+Zulip's [translations infrastructure](translating.html) generates
several files from the source data, which we manage similar to our
emoji, but without the caching (and thus without the
garbage-collection). New translations data is downloaded from
@@ -288,7 +288,7 @@ usually one needs to think about making changes in 3 places:
* `tools/lib/provision.py`. This is the main provisioning script,
used by most developers to maintain their development environment.
-* `docs/development/dev-setup-non-vagrant.md`. This is our "manual installation"
+* `docs/dev-setup-non-vagrant.md`. This is our "manual installation"
documentation. Strategically, we'd like to move the support for more
versions of Linux from here into `tools/lib/provision.py`.
* Production. Our tools for compiling/generating static assets need
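
The virtualenv-caching scheme mentioned in this file's diff — naming cached virtualenvs by the hash of the relevant `requirements.txt` file — can be sketched as follows; the cache-root path and helper name are illustrative, not Zulip's actual implementation:

```python
import hashlib
import os
import tempfile

def venv_cache_path(requirements_file: str, cache_root: str) -> str:
    """Return the cache directory for the virtualenv matching this lock file.

    The cache key is the SHA-1 of the file's contents, so any change to the
    pinned dependency set maps to a fresh entry, while an unchanged lock
    file reuses the existing virtualenv.
    """
    with open(requirements_file, "rb") as f:
        digest = hashlib.sha1(f.read()).hexdigest()
    return os.path.join(cache_root, digest)

# Demo: the same lock-file contents always map to the same cache entry.
with tempfile.TemporaryDirectory() as tmp:
    lock = os.path.join(tmp, "prod_lock.txt")
    with open(lock, "w") as f:
        f.write("Django==1.11.5\npytz==2017.2\n")
    path_a = venv_cache_path(lock, "/srv/zulip-venv-cache")
    path_b = venv_cache_path(lock, "/srv/zulip-venv-cache")
```

Under this scheme, garbage collection reduces to deleting cache entries whose hash no longer matches any deployment's lock file.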


@@ -24,7 +24,7 @@ environment,** check
[Troubleshooting and Common Errors](#troubleshooting-and-common-errors). If
that doesn't help, please visit
[#provision help](https://chat.zulip.org/#narrow/stream/provision.20help)
-in the [Zulip development community server](../contributing/chat-zulip-org.html) for
+in the [Zulip development community server](chat-zulip-org.html) for
real-time help, send a note to the
[Zulip-devel Google group](https://groups.google.com/forum/#!forum/zulip-devel)
or [file an issue](https://github.com/zulip/zulip/issues).
@@ -46,9 +46,9 @@ proxy](#specifying-a-proxy) if you need a proxy to access the internet.)
- **All**: 2GB available RAM, Active broadband internet connection, [GitHub account][set-up-git].
-- **macOS**: macOS (10.11 El Capitan or newer recommended), Git,
-VirtualBox (version [5.2.6][vbox-dl-macos] recommended -- we find
-it's more stable than more recent versions),
+- **macOS**: macOS (10.11 El Capitan or 10.12 Sierra recommended),
+Git, VirtualBox (version [5.1.8][vbox-dl-macos]
+recommended -- we find it's more stable than more recent versions),
[Vagrant][vagrant-dl-macos].
- **Ubuntu**: 14.04 64-bit or 16.04 64-bit, Git, [Vagrant][vagrant-dl-deb], lxc.
- or **Debian**: 9.0 "stretch" 64-bit
@@ -82,12 +82,8 @@ Jump to:
#### macOS
-0. If you are running MacOS High Sierra, make sure you are not running
-a version with a
-[buggy NFS implementation](#importerror-no-module-named-on-macos-during-vagrant-provisioning).
-Versions 10.13.2 and above have the bug fixed.
-1. Install [Vagrant][vagrant-dl-macos] (2.0.2).
-2. Install [VirtualBox][vbox-dl-macos] (5.2.6).
+1. Install [Vagrant][vagrant-dl-macos] (1.8.4-1.8.6, do not use 1.8.7).
+2. Install [VirtualBox][vbox-dl-macos] (5.1.8).
(For a non-free option, but better performance, you can also use [VMWare
Fusion][vmware-fusion-dl] with the [VMWare Fusion Vagrant
@@ -104,7 +100,7 @@ after which you can jump to [Step 2: Get Zulip Code](#step-2-get-zulip-code):
```
sudo apt-get -y purge vagrant && \
-wget https://releases.hashicorp.com/vagrant/2.0.2/vagrant_2.0.2_x86_64.deb && \
+wget https://releases.hashicorp.com/vagrant/1.8.6/vagrant_1.8.6_x86_64.deb && \
sudo dpkg -i vagrant*.deb && \
sudo apt-get -y install build-essential git ruby lxc lxc-templates cgroup-lite redir && \
vagrant plugin install vagrant-lxc && \
@@ -126,11 +122,11 @@ christie@ubuntu-desktop:~
$ sudo apt-get purge vagrant
```
-Now download and install the .deb package for [Vagrant 2.0.2][vagrant-dl-deb]:
+Now download and install the .deb package for [Vagrant 1.8.6][vagrant-dl-deb]:
```
christie@ubuntu-desktop:~
-$ wget https://releases.hashicorp.com/vagrant/2.0.2/vagrant_2.0.2_x86_64.deb
+$ wget https://releases.hashicorp.com/vagrant/1.8.6/vagrant_1.8.6_x86_64.deb
christie@ubuntu-desktop:~
$ sudo dpkg -i vagrant*.deb
@@ -174,36 +170,35 @@ Now you are ready for [Step 2: Get Zulip Code.](#step-2-get-zulip-code)
#### Debian
-The setup for Debian 9.0 "stretch" is very similar to that
-[for Ubuntu 16.04 above](#ubuntu). Follow those instructions,
-except with the following differences:
+The setup for Debian 9.0 "stretch" is just like [for Ubuntu 16.04](#ubuntu),
+with one difference.
-**Apt package list**. In "2. Install remaining dependencies", the
-command to install the dependencies is a bit shorter:
+If you're in a hurry, you can copy and paste the following into your terminal
+after which you can jump to [Step 2: Get Zulip Code](#step-2-get-zulip-code):
```
-christie@ubuntu-desktop:~
-$ sudo apt-get install build-essential git ruby lxc redir
+sudo apt-get -y purge vagrant && \
+wget https://releases.hashicorp.com/vagrant/1.8.6/vagrant_1.8.6_x86_64.deb && \
+sudo dpkg -i vagrant*.deb && \
+sudo apt-get -y install build-essential git ruby lxc redir && \
+vagrant plugin install vagrant-lxc && \
+vagrant lxc sudoers
```
-**Set up LXC networking**. After completing "2. Install remaining
-dependencies", you will have to set up networking for LXC containers,
-because Debian's packaging for LXC does not ship any default
-network setup for them. You can do this by
-[following the steps][lxc-networking-quickstart] outlined in
-[Debian's LXC docs](https://wiki.debian.org/LXC#network_setup).
+For a step-by-step explanation, follow the [Ubuntu instructions above](#ubuntu),
+with the following difference: in "2. Install remaining dependencies", the
+command is
-[lxc-networking-quickstart]: https://wiki.debian.org/LXC#Minimal_changes_to_set_up_networking_for_LXC_for_Debian_.2BIBw-stretch.2BIB0_.28testing.29
+```
+sudo apt-get install build-essential git ruby lxc redir
+```
+Then return to the next step in the Ubuntu instructions above. After
+finishing those steps, you will be ready for
+[Step 2: Get Zulip Code](#step-2-get-zulip-code).
#### Windows 10
1. Install [Git for Windows][git-bash], which installs *Git BASH*.
-2. Install [VirtualBox][vbox-dl] (version == 5.2.6).
-3. Install [Vagrant][vagrant-dl-win] (version 2.0.2, do not use 1.8.7).
+2. Install [VirtualBox][vbox-dl] (version >= 5.1.6).
+3. Install [Vagrant][vagrant-dl-win] (version 1.8.4-1.8.6, do not use 1.8.7).
(Note: While *Git BASH* is recommended, you may also use [Cygwin][cygwin-dl].
If you do, make sure to **install default required packages** along with
@@ -275,13 +270,9 @@ Now you are ready for [Step 2: Get Zulip Code.](#step-2-get-zulip-code)
do this.
2. Open Terminal (macOS/Ubuntu) or Git BASH (Windows; must
**run as an Administrator**).
-3. In Terminal/Git BASH,
-[clone your fork of the Zulip repository](../git/cloning.html#step-1b-clone-to-your-machine)
-and [connect the Zulip upstream repository](../git/cloning.html#step-1c-connect-your-fork-to-zulip-upstream):
+3. In Terminal/Git BASH, clone your fork:
```
-git clone --config pull.rebase git@github.com:YOURUSERNAME/zulip.git
-git remote add -f upstream https://github.com/zulip/zulip.git
+git clone git@github.com:YOURUSERNAME/zulip.git
```
This will create a 'zulip' directory and download the Zulip code into it.
@@ -291,7 +282,7 @@ something like:
```
christie@win10 ~
-$ git clone --config pull.rebase git@github.com:YOURUSERNAME/zulip.git
+$ git clone git@github.com:YOURUSERNAME/zulip.git
Cloning into 'zulip'...
remote: Counting objects: 73571, done.
remote: Compressing objects: 100% (2/2), done.
@@ -330,7 +321,7 @@ does the following:
downloads all required dependencies, sets up the python environment for
the Zulip development server, and initializes a default test
database. We call this process "provisioning", and it is documented
-in some detail in our [dependencies documentation](../subsystems/dependencies.html).
+in some detail in our [dependencies documentation](dependencies.html).
You will need an active internet connection during the entire
process. (See [Specifying a proxy](#specifying-a-proxy) if you need a
@@ -342,7 +333,7 @@ documented in the
[Troubleshooting and Common Errors](#troubleshooting-and-common-errors)
section. If that doesn't help, please visit
[#provision help](https://chat.zulip.org/#narrow/stream/provision.20help)
-in the [Zulip development community server](../contributing/chat-zulip-org.html) for
+in the [Zulip development community server](chat-zulip-org.html) for
real-time help.
On Windows, you will see `The system cannot find the path specified.` message
@@ -436,7 +427,7 @@ navigating to <http://localhost:9991/> in the browser on your main machine.
You should see something like this:
-![Image of Zulip development environment](../images/zulip-dev.png)
+![Image of Zulip development environment](images/zulip-dev.png)
The Zulip server will continue to run and send output to the terminal window.
When you navigate to Zulip in your browser, check your terminal and you
@@ -479,7 +470,7 @@ It's good to have the terminal running `run-dev.py` up as you work since error
messages including tracebacks along with every backend request will be printed
there.
-See [Logging](../subsystems/logging.html) for further details on the run-dev.py console
+See [Logging](logging.html) for further details on the run-dev.py console
output.
#### Committing and pushing changes with git
@@ -508,7 +499,7 @@ After provisioning, you'll want to
If you run into any trouble, the
[#provision help](https://chat.zulip.org/#narrow/stream/provision.20help)
-in the [Zulip development community server](../contributing/chat-zulip-org.html) for
+in the [Zulip development community server](chat-zulip-org.html) for
is a great place to ask for help.
#### Rebuilding the development environment
@@ -604,7 +595,7 @@ If these solutions aren't working for you or you encounter an issue not
documented below, there are a few ways to get further help:
* Ask in [#provision help](https://chat.zulip.org/#narrow/stream/provision.20help)
-in the [Zulip development community server](../contributing/chat-zulip-org.html),
+in the [Zulip development community server](chat-zulip-org.html),
* send a note to the [Zulip-devel Google
group](https://groups.google.com/forum/#!forum/zulip-devel), or
* [File an issue](https://github.com/zulip/zulip/issues).
@@ -662,8 +653,8 @@ macOS.
On **macOS** this error is most likely to occur with Vagrant version 1.8.7 and
is a [known issue](https://github.com/mitchellh/vagrant/issues/7997).
-The solution is to downgrade Vagrant to version 2.0.2 ([available
-here](https://releases.hashicorp.com/vagrant/2.0.2/)), or to use your system's
+The solution is to downgrade Vagrant to version 1.8.6 ([available
+here](https://releases.hashicorp.com/vagrant/1.8.6/)), or to use your system's
version of `curl` instead of the one that ships with Vagrant:
```
@@ -722,33 +713,6 @@ This is equivalent of running a halt followed by an up (aka rebooting
the guest). After this, you can do `vagrant provision` and `vagrant
ssh`.
-#### ssl read error
-If you receive the following error while running `vagrant up`:
-```
-SSL read: error:00000000:lib(0):func(0):reason(0), errno 104
-```
-It means that either your network connection is unstable and/or very
-slow. To resolve it, run `vagrant up` until it works (possibly on a
-better network connection).
-#### Unmet dependencies error
-When running `vagrant up` or `provision`, if you see the following error:
-```
-==> default: E:unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).
-```
-It means that your local apt repository has been corrupted, which can
-usually be resolved by executing the command:
-```
-apt-get -f install
-```
#### ssh connection closed by remote host
On running `vagrant ssh`, if you see the following error:
@@ -914,7 +878,7 @@ Likely causes are:
1. Networking issues
2. Insufficient RAM. Check whether you've allotted at least two
gigabytes of RAM, which is the minimum Zulip
-[requires](../development/setup-vagrant.html#requirements). If
+[requires](dev-env-first-time-contributors.html#requirements). If
not, go to your VM settings and increase the RAM, then restart
the VM.
@@ -992,58 +956,10 @@ christie@xenial:~
$ sudo patch --directory /usr/lib/ruby/vendor_ruby/vagrant < vagrant-plugin.patch
patching file bundler.rb
```
-#### VT-X unavailability error
-Users who are unable to do "vagrant up" due to a VT-X unavailability error need to disable "Hyper-V" to get it to work.
#### Permissions errors when running the test suite in LXC
-See ["Possible testing issues"](../testing/testing.html#possible-testing-issues).
-#### ImportError: No module named '...' on MacOS during Vagrant provisioning
-If you see following error (or similar) when you try to provision
-Vagrant environment by `vagrant provision` (or during first run
-`vagrant up`):
-```
-default: ImportError: No module named 'zerver.lib.emoji'
-default: Error running a subcommand of ./lib/provision.py: tools/do-destroy-rebuild-database
-default: Actual error output for the subcommand is just above this.
-default: Traceback (most recent call last):
-default: File "./lib/provision.py", line 413, in <module>
-default: sys.exit(main(options))
-default: File "./lib/provision.py", line 349, in main
-default: run(["tools/do-destroy-rebuild-database"])
-default: File "/srv/zulip/scripts/lib/zulip_tools.py", line 163, in run
-default: subprocess.check_call(args, **kwargs)
-default: File "/usr/lib/python3.4/subprocess.py", line 561, in check_call
-default: raise CalledProcessError(retcode, cmd)
-default: subprocess.CalledProcessError: Command '['tools/do-destroy-rebuild-database']' returned non-zero exit status 1
-default:
-default: Provisioning failed!
-default: * Look at the traceback(s) above to find more about the errors.
-default: * Resolve the errors or get help on chat.
-default: * If you can fix this yourself, you can re-run tools/provision at any time.
-default: * Logs are here: zulip/var/log/provision.log
-default:
-The SSH command responded with a non-zero exit status. Vagrant
-assumes that this means the command failed. The output for this command
-should be in the log above. Please read the output to determine what
-went wrong.
-```
-This error is caused by a bug in the MacOS NFS file syncing
-implementation (Zulip uses Vagrant's NFS feature for syncing files on
-MacOS). In early versions of MacOS High Sierra, files present in the
-directory on the host machine would appear to not be present in the
-Vagrant guest (e.g. in the exception above, `zerver/lib/emoji.py` is
-missing). This bug is fixed in MacOS High Sierra 10.13.2 and above,
-so the fix is to upgrade to a version of MacOS with a working NFS
-implementation.
-You can read more about this
-[here](https://github.com/hashicorp/vagrant/issues/8788).
+See ["Possible testing issues"](testing.html#possible-testing-issues).
### Specifying a proxy
@@ -1096,22 +1012,22 @@ for the IP address that means any IP address can connect to your development ser
[cygwin-dl]: http://cygwin.com/
[vagrant-dl]: https://www.vagrantup.com/downloads.html
-[vagrant-dl-win]: https://releases.hashicorp.com/vagrant/2.0.2/vagrant_2.0.2_x86_64.msi
-[vagrant-dl-macos]: https://releases.hashicorp.com/vagrant/2.0.2/vagrant_2.0.2_x86_64.dmg
-[vagrant-dl-deb]: https://releases.hashicorp.com/vagrant/2.0.2/vagrant_2.0.2_x86_64.deb
+[vagrant-dl-win]: https://releases.hashicorp.com/vagrant/1.8.6/vagrant_1.8.6.msi
+[vagrant-dl-macos]: https://releases.hashicorp.com/vagrant/1.8.6/vagrant_1.8.6.dmg
+[vagrant-dl-deb]: https://releases.hashicorp.com/vagrant/1.8.6/vagrant_1.8.6_x86_64.deb
[vagrant-lxc]: https://github.com/fgrehm/vagrant-lxc
[vbox-dl]: https://www.virtualbox.org/wiki/Downloads
-[vbox-dl-macos]: https://download.virtualbox.org/virtualbox/5.2.6/VirtualBox-5.2.6-120293-OSX.dmg
+[vbox-dl-macos]: http://download.virtualbox.org/virtualbox/5.1.8/VirtualBox-5.1.8-111374-OSX.dmg
[vmware-fusion-dl]: http://www.vmware.com/products/fusion.html
[vagrant-vmware-fusion-dl]: https://www.vagrantup.com/vmware/
[avoiding-sudo]: https://github.com/fgrehm/vagrant-lxc#avoiding-sudo-passwords
-[install-advanced]: ../development/setup-advanced.html
+[install-advanced]: dev-setup-non-vagrant.html
[lxc-sf]: https://github.com/fgrehm/vagrant-lxc/wiki/FAQ#help-my-shared-folders-have-the-wrong-owner
-[rtd-git-guide]: ../git/index.html
-[rtd-testing]: ../testing/testing.html
-[rtd-using-dev-env]: using.html
-[rtd-dev-remote]: remote.html
+[rtd-git-guide]: git-guide.html
+[rtd-testing]: testing.html
+[rtd-using-dev-env]: using-dev-environment.html
+[rtd-dev-remote]: dev-remote.html
[git-bash]: https://git-for-windows.github.io/
[bash-admin-setup]: https://superuser.com/questions/1002262/run-applications-as-administrator-by-default-in-windows-10
-[set-up-git]: ../git/setup.html
-[travis-ci]: ../git/cloning.html#step-3-configure-travis-ci-continuous-integration
+[set-up-git]: git-guide.html#set-up-git
+[travis-ci]: git-guide.html#step-3-configure-travis-ci-continuous-integration


@@ -82,14 +82,14 @@ And if you've setup the Zulip development environment on a remote
machine, take a look at our tips for
[developing remotely][dev-remote].
-[dev-remote]: remote.html
-[install-direct]: ../development/setup-advanced.html#installing-directly-on-ubuntu
-[install-docker]: ../development/setup-advanced.html#using-docker-experimental
-[install-generic]: ../development/setup-advanced.html#installing-manually-on-linux
-[install-vagrant]: ../development/setup-vagrant.html
+[dev-remote]: dev-remote.html
+[install-direct]: dev-setup-non-vagrant.html#installing-directly-on-ubuntu
+[install-docker]: dev-setup-non-vagrant.html#using-docker-experimental
+[install-generic]: dev-setup-non-vagrant.html#installing-manually-on-linux
+[install-vagrant]: dev-env-first-time-contributors.html
[self-install-remote]: #installing-remotely
[self-slow-internet]: #slow-internet-connections
-[configure-proxy]: ../development/setup-vagrant.html#specifying-a-proxy
-[using-dev-env]: using.html
-[testing]: ../testing/testing.html
-[travis-ci]: ../git/cloning.html#step-3-configure-travis-ci-continuous-integration
+[configure-proxy]: dev-env-first-time-contributors.html#specifying-a-proxy
+[using-dev-env]: using-dev-environment.html
+[testing]: testing.html
+[travis-ci]: git-guide.html#step-3-configure-travis-ci-continuous-integration


@@ -41,19 +41,10 @@ the remote virtual machine, we recommend installing
[Vagrant][install-vagrant] method so you can easily uninstall if you
need to.
-The main difference from the standard instructions is that for a
-remote development environment, you'll need to run `export
-EXTERNAL_HOST=<REMOTE_IP>:9991` in a shell before running `run-dev.py`
-(and see also the `--interface=''` option documented below). If your
-server has a static IP address, we recommend putting this command in
-`~/.bashrc`, so you don't need to remember to run it every time. This
-allows you to access Zulip running in your development environment
-using a browser on another host.
## Running the development server
Once you have set up the development environment, you can start up the
-development server with the following command in the directory where
+development instance of Zulip with the following command in the directory where
you cloned Zulip:
```
@@ -65,12 +56,7 @@ navigate to `http://<REMOTE_IP>:9991` and you should see something like
this screenshot of the Zulip development environment:
![Image of Zulip development
-environment](../images/zulip-dev.png)
-The `--interface=''` command makes the Zulip development environment
-accessible from any IP address (in contrast with the more secure
-default of only being accessible from localhost, which is great for
-developing on your laptop).
+environment](images/zulip-dev.png)
You can [port
forward](https://help.ubuntu.com/community/SSH/OpenSSH/PortForwarding) using
@@ -101,7 +87,7 @@ don't have a favorite, here are some suggestions:
* [spacemacs](https://github.com/syl20bnr/spacemacs)
* [sublime](https://www.sublimetext.com/)
-Next, follow our [Git and GitHub Guide](../git/index.html) to clone and configure
+Next, follow our [Git and GitHub Guide](git-guide.html) to clone and configure
your fork of zulip on your local computer.
Once you have cloned your code locally, you can get to work.
@@ -186,13 +172,13 @@ Next, read the following to learn more about developing for Zulip:
* [Using the Development Environment][rtd-using-dev-env]
* [Testing][rtd-testing]
-[install-direct]: ../development/setup-advanced.html#installing-directly-on-ubuntu
-[install-generic]: ../development/setup-advanced.html#installing-manually-on-linux
-[install-vagrant]: ../development/setup-vagrant.html
-[rtd-git-guide]: ../git/index.html
-[rtd-using-dev-env]: using.html
-[rtd-testing]: ../testing/testing.html
+[install-direct]: dev-setup-non-vagrant.html#installing-directly-on-ubuntu
+[install-generic]: dev-setup-non-vagrant.html#installing-manually-on-linux
+[install-vagrant]: dev-env-first-time-contributors.html
+[rtd-git-guide]: git-guide.html
+[rtd-using-dev-env]: using-dev-environment.html
+[rtd-testing]: testing.html
[git-bash]: https://git-for-windows.github.io/
[codeanywhere]: https://codeanywhere.com/
-[img-ca-settings]: ../images/codeanywhere-settings.png
-[img-ca-workspace]: ../images/codeanywhere-workspace.png
+[img-ca-settings]: images/codeanywhere-settings.png
+[img-ca-workspace]: images/codeanywhere-workspace.png


@@ -1,21 +1,15 @@
-# Advanced Setup (non-Vagrant)
+# Zulip development environment setup without Vagrant
Contents:
* [Installing directly on Ubuntu](#installing-directly-on-ubuntu)
* [Installing manually on Linux](#installing-manually-on-linux)
* [Installing directly on cloud9](#installing-on-cloud9)
* [Using Docker (experimental)](#using-docker-experimental)
## Installing directly on Ubuntu
-Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
-and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
-```
-git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
-git remote add -f upstream https://github.com/zulip/zulip.git
-```
+Start by cloning this repository: `git clone
+https://github.com/zulip/zulip.git`
If you'd like to install a Zulip development environment on a computer
that's already running Ubuntu 16.04 Xenial or Ubuntu 14.04 Trusty, you
@@ -34,7 +28,7 @@ development environment).
Once you've done the above setup, you can pick up the [documentation
on using the Zulip development
-environment](../development/setup-vagrant.html#step-4-developing),
+environment](dev-env-first-time-contributors.html#step-4-developing),
ignoring the parts about `vagrant` (since you're not using it).
## Installing manually on Linux
@@ -67,13 +61,8 @@ Install the following non-Python dependencies:
#### Using the official Ubuntu repositories, PGroonga PPA and `tsearch-extras` deb package:
-Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
-and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
-```
-git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
-git remote add -f upstream https://github.com/zulip/zulip.git
-```
+Start by cloning this repository: `git clone
+https://github.com/zulip/zulip.git`
```
sudo apt-get install closure-compiler libfreetype6-dev libffi-dev \
@@ -89,8 +78,6 @@ sudo apt-get update
sudo apt-get install postgresql-9.3-pgroonga
# On 16.04
sudo apt-get install postgresql-9.5-pgroonga
# On 17.04 or 17.10
sudo apt-get install postgresql-9.6-pgroonga
# If using Debian, follow the instructions here: http://pgroonga.github.io/install/debian.html
@@ -120,7 +107,7 @@ sudo dpkg -i postgresql-9.4-tsearch-extras_0.1_amd64.deb
# If on 16.04 or stretch
wget https://launchpad.net/~tabbott/+archive/ubuntu/zulip/+files/postgresql-9.5-tsearch-extras_0.2_amd64.deb
sudo dpkg -i postgresql-9.5-tsearch-extras_0.3_amd64.deb
sudo dpkg -i postgresql-9.5-tsearch-extras_0.2_amd64.deb
```
Alternatively, you can always build the package from [tsearch-extras
@@ -132,13 +119,8 @@ Now continue with the [All Systems](#all-systems) instructions below.
[zulip-ppa]: https://launchpad.net/~tabbott/+archive/ubuntu/zulip/+packages
Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
```
git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
git remote add -f upstream https://github.com/zulip/zulip.git
```
Start by cloning this repository: `git clone
https://github.com/zulip/zulip.git`
```
sudo add-apt-repository ppa:tabbott/zulip
@@ -157,13 +139,8 @@ Now continue with the [All Systems](#all-systems) instructions below.
These instructions are experimental and may have bugs; patches
welcome!
Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
```
git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
git remote add -f upstream https://github.com/zulip/zulip.git
```
Start by cloning this repository: `git clone
https://github.com/zulip/zulip.git`
```
sudo dnf install libffi-devel memcached rabbitmq-server \
@@ -179,13 +156,8 @@ Now continue with the [Common to Fedora/CentOS](#common-to-fedora-centos-instruc
These instructions are experimental and may have bugs; patches
welcome!
Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
```
git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
git remote add -f upstream https://github.com/zulip/zulip.git
```
Start by cloning this repository: `git clone
https://github.com/zulip/zulip.git`
```
# Add user zulip to the system (not necessary if you configured zulip
@@ -218,7 +190,7 @@ sudo yum install libffi-devel memcached rabbitmq-server openldap-devel \
sudo yum groupinstall "Development Tools"
# clone Zulip's git repo and cd into it
cd && git clone --config pull.rebase https://github.com/zulip/zulip && cd zulip/
cd && git clone https://github.com/zulip/zulip && cd zulip/
## NEEDS TESTING: The next few DB setup items may not be required at all.
# Initialize the postgres db
@@ -242,13 +214,8 @@ Now continue with the [Common to Fedora/CentOS](#common-to-fedora-centos-instruc
These instructions are experimental and may have bugs; patches
welcome!
Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
```
git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
git remote add -f upstream https://github.com/zulip/zulip.git
```
Start by cloning this repository: `git clone
https://github.com/zulip/zulip.git`
```
doas pkg_add sudo bash gcc postgresql-server redis rabbitmq \
@@ -281,13 +248,8 @@ Finally continue with the [All Systems](#all-systems) instructions below.
### Common to Fedora/CentOS instructions
Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
```
git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
git remote add -f upstream https://github.com/zulip/zulip.git
```
Start by cloning this repository: `git clone
https://github.com/zulip/zulip.git`
```
# Build and install postgres tsearch-extras module
@@ -349,7 +311,7 @@ sudo virtualenv /srv/zulip-py3-venv -p python3 # Create a python3 virtualenv
sudo chown -R `whoami`:`whoami` /srv/zulip-py3-venv
source /srv/zulip-py3-venv/bin/activate # Activate python3 virtualenv
pip install --upgrade pip # upgrade pip itself because older versions have known issues
pip install --no-deps -r requirements/dev.txt # install python packages required for development
pip install --no-deps -r requirements/dev_lock.txt # install python packages required for development
```
Now run these commands:
@@ -361,7 +323,6 @@ sudo mkdir /srv/zulip-emoji-cache
sudo chown -R `whoami`:`whoami` /srv/zulip-emoji-cache
./tools/setup/emoji/build_emoji
./tools/inline-email-css
./tools/generate-custom-icon-webfont
./tools/setup/build_pygments_data
./tools/setup/generate_zulip_bots_static_files
./scripts/setup/generate_secrets.py --development
@@ -404,63 +365,10 @@ proxy in the environment as follows:
yarn config set https-proxy http://proxy_host:port
```
## Installing on cloud9
AWS Cloud9 is a cloud-based integrated development environment (IDE)
that lets you write, run, and debug your code with just a browser. It
includes a code editor, debugger, and terminal.
This section documents how to set up the Zulip development environment
in a cloud9 workspace. If you don't have an existing cloud9 account,
you can sign up [here](https://aws.amazon.com/cloud9/).
* Create a Workspace, and select the blank template.
* Resize the workspace to be 1GB of memory and 4GB of disk
space. (This is within the free limit for both the old Cloud9 and the
AWS Free Tier.)
* Clone the zulip repo: `git clone --config pull.rebase
https://github.com/<your-username>/zulip.git`
* Restart rabbitmq-server, since it's broken on cloud9: `sudo service
rabbitmq-server restart`.
* Once that's done, run provision: `cd zulip && ./tools/provision`.
* Activate the zulip virtual environment by `source
/srv/zulip-py3-venv/bin/activate` or by opening a new terminal.
#### Install zulip-cloud9
There's an NPM package, `zulip-cloud9`, that provides a wrapper around
the Zulip development server for use in the Cloud9 environment.
Note: `npm i -g zulip-cloud9` does not work in zulip's virtual
environment. However, by default, any packages installed in the
workspace folder (i.e. the top-level folder) are added to `$PATH`.
```bash
cd .. # switch to workspace folder if you are in zulip directory
npm i zulip-cloud9
zulip-dev start # to start the development server
```
If you get an error of the form `bash: cannot find command zulip-dev`,
you need to start a new terminal.
Your development server will be running at
`https://<workspace-name>-<username>.c9users.io` on port 8080. You
don't need to add `:8080` to your URL, since the cloud9 proxy should
automatically forward the connection. You might want to visit the
[zulip-cloud9 repo](https://github.com/cPhost/zulip-cloud9) and its
[wiki](https://github.com/cPhost/zulip-cloud9/wiki) for more info on
how to use the zulip-cloud9 package.
## Using Docker (experimental)
Start by [cloning your fork of the Zulip repository][zulip-rtd-git-cloning]
and [connecting the Zulip upstream repository][zulip-rtd-git-connect]:
```
git clone --config pull.rebase https://github.com/YOURUSERNAME/zulip.git
git remote add -f upstream https://github.com/zulip/zulip.git
```
Start by cloning this repository: `git clone
https://github.com/zulip/zulip.git`
The docker instructions for development are experimental, so they may
have bugs. If you try them and run into any issues, please report
@@ -471,8 +379,8 @@ First, you need to install Docker in your development machine
following the [instructions][docker-install]. Some other interesting
links for somebody new to Docker are:
* [Get Started](https://docs.docker.com/get-started/)
* [Understand the architecture](https://docs.docker.com/engine/docker-overview/)
* [Get Started](https://docs.docker.com/engine/installation/linux/)
* [Understand the architecture](https://docs.docker.com/engine/understanding-docker/)
* [Docker run reference](https://docs.docker.com/engine/reference/run/)
* [Dockerfile reference](https://docs.docker.com/engine/reference/builder/)
@@ -503,7 +411,7 @@ docker run -itv $(pwd):/srv/zulip -p 9991:9991 user/zulipdev:v2 \
```
You'll want to
[read the guide for Zulip development](../development/setup-vagrant.html#step-4-developing)
[read the guide for Zulip development](dev-env-first-time-contributors.html#step-4-developing)
to understand how to use the Zulip development environment. Note that
`start-dockers` automatically runs `tools/run-dev.py` inside the
container; you can then visit http://localhost:9991 to connect to your
@@ -555,6 +463,3 @@ the results in your browser.
Currently, the Docker workflow is substantially less convenient than
the Vagrant workflow and less documented; please contribute to this
guide and the Docker tooling if you are using Docker to develop Zulip!
[zulip-rtd-git-cloning]: ../git/cloning.html#step-1b-clone-to-your-machine
[zulip-rtd-git-connect]: ../git/cloning.html#step-1c-connect-your-fork-to-zulip-upstream


@@ -1,12 +0,0 @@
#######################
Development Environment
#######################
.. toctree::
:maxdepth: 3
Development environment installation <overview>
Recommended setup (Vagrant) <setup-vagrant>
Advanced Setup (non-Vagrant) <setup-advanced>
Using the development environment <using>
Developing remotely <remote>


@@ -1,83 +0,0 @@
```eval_rst
:orphan:
```
# How to request a remote Zulip development instance
Under specific circumstances, typically during sprints, hackathons, and
Google Code-in, Zulip can provide you with a virtual machine with the
development environment already set up.
The machines (droplets) are being generously provided by
[Digital Ocean](https://www.digitalocean.com/). Thank you Digital Ocean!
## Step 1: Join GitHub and create SSH Keys
To contribute to Zulip and to use a remote Zulip developer instance, you'll
need a GitHub account. If you don't already have one, sign up
[here][github-join].
You'll also need to [create SSH keys and add them to your GitHub
account][github-help-add-ssh-key].
## Step 2: Create a fork of zulip/zulip
Zulip uses a **forked-repo** and **[rebase][gitbook-rebase]-oriented
workflow**. This means that all contributors create a fork of the [Zulip
repository][github-zulip-zulip] they want to contribute to and then submit pull
requests to the upstream repository to have their contributions reviewed and
accepted.
When we create your Zulip dev instance, we'll connect it to your fork of Zulip,
so that needs to exist before you make your request.
While you're logged in to GitHub, navigate to [zulip/zulip][github-zulip-zulip]
and click the **Fork** button. (See [GitHub's help article][github-help-fork]
for further details).
## Step 3: Make request via chat.zulip.org
Now that you have a GitHub account, have added your SSH keys, and forked
zulip/zulip, you are ready to request your Zulip developer instance.
If you haven't already, create an account on https://chat.zulip.org/.
Next, join the [development
help](https://chat.zulip.org/#narrow/stream/development.20help) stream. Create a
new **stream message** with your GitHub username as the **topic** and request
your remote dev instance. **Please make sure you have completed steps 1 and 2
before doing so**. A core developer should reply letting you know they're
working on creating it as soon as they are available to help.
Once requested, it will only take a few minutes to create your instance. You
will be contacted when it is complete and available.
## Next steps
Once your remote dev instance is ready:
- Connect to your server by running
`ssh zulipdev@<username>.zulipdev.org` on the command line
(Terminal for macOS and Linux, Bash for Git on Windows).
- There is no password; your account is configured to use your SSH keys.
- Once you log in, you should see `(zulip-py3-venv) ~$`.
- To start the dev server, `cd zulip` and then run `./tools/run-dev.py`.
- While the dev server is running, you can see the Zulip server in your browser
at http://username.zulipdev.org:9991.
Once you've confirmed you can connect to your remote server, take a look at:
* [developing remotely](../development/remote.html) for tips on using the remote dev
instance, and
* our [Git & GitHub Guide](../git/index.html) to learn how to use Git with Zulip.
Next, read the following to learn more about developing for Zulip:
* [Using the Development Environment](../development/using.html)
* [Testing](../testing/testing.html)
[github-join]: https://github.com/join
[github-help-add-ssh-key]: https://help.github.com/articles/adding-a-new-ssh-key-to-your-github-account/
[github-zulip-zulip]: https://github.com/zulip/zulip/
[github-help-fork]: https://help.github.com/articles/fork-a-repo/
[gitbook-rebase]: https://git-scm.com/book/en/v2/Git-Branching-Rebasing


@@ -4,7 +4,7 @@ This page documents the Zulip directory structure, where to find
things, and how to decide where to put a file.
You may also find the [new application feature
tutorial](../tutorials/new-feature-tutorial.html) helpful for understanding the
tutorial](new-feature-tutorial.html) helpful for understanding the
flow through these files.
### Core Python files
@@ -25,16 +25,15 @@ paths will be familiar to Django developers.
* `zerver/views/*.py` Most [Django views](https://docs.djangoproject.com/en/1.8/topics/http/views/).
* `zerver/webhooks/` Webhook views and tests for [Zulip webhook integrations](
https://zulipchat.com/api/integration-guide).
* `zerver/webhooks/` Webhook views and tests for [Zulip webhook integrations](integration-guide.html).
* `zerver/tornado/views.py` Tornado views.
* `zerver/worker/queue_processors.py` [Queue workers](../subsystems/queuing.html).
* `zerver/worker/queue_processors.py` [Queue workers](queuing.html).
* `zerver/lib/*.py` Most library code.
* `zerver/lib/bugdown/` [Backend Markdown processor](../subsystems/markdown.html).
* `zerver/lib/bugdown/` [Backend Markdown processor](markdown.html).
* `zproject/backends.py` [Authentication backends](https://docs.djangoproject.com/en/1.8/topics/auth/customizing/).
@@ -42,7 +41,7 @@ paths will be familiar to Django developers.
### HTML Templates
See [our docs](../subsystems/html-templates.html) for details on Zulip's
See [our docs](html-templates.html) for details on Zulip's
templating systems.
* `templates/zerver/` For [Jinja2](http://jinja.pocoo.org/) templates
@@ -87,7 +86,7 @@ These are distinguished from scripts, below, by needing to run a
Django context (i.e. with database access).
* `zerver/management/commands/`
[Management commands](../subsystems/management-commands.html) one might run at a
[Management commands](management-commands.html) one might run at a
production deployment site (e.g. scripts to change a value or
deactivate a user properly).


@@ -2,7 +2,7 @@
This page has developer documentation on the Zulip email system. If you're
trying to configure your server to send email, you might be looking for our
guide to [sending outgoing email](../production/email.html). If you're trying to
guide to [sending outgoing email](prod-email.html). If you're trying to
configure an email integration to receive incoming email (e.g. so that users
can reply to missed message emails via email), you might be interested in
our instructions for
@@ -40,7 +40,7 @@ Email takes about a quarter second per email to process and send. Generally
speaking, if you're sending just one email, doing it in the current process
is fine. If you're sending emails in a loop, you probably want to send
them from a queue. Documentation on our queueing system is available
[here](../subsystems/queuing.html).
[here](queuing.html).
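As an illustrative sketch (a toy in-memory queue, not Zulip's actual
queue API or worker code), the rule of thumb above looks like this:

```python
from collections import deque

# Toy stand-in for a real queue (Zulip's is backed by RabbitMQ).
outgoing = deque()

def send_now(to, subject):
    # Fine for a single email: blocks the current process for ~0.25s.
    pass  # ...the actual SMTP send would happen here...

def enqueue_email(to, subject):
    # Better inside a loop: hand the slow send off to a queue worker.
    outgoing.append({"to": to, "subject": subject})

for user in ["a@example.com", "b@example.com", "c@example.com"]:
    enqueue_email(user, "Your daily digest")

# A queue worker would now drain `outgoing` out of band.
print(len(outgoing))  # 3
```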
## Development and testing
@@ -56,23 +56,6 @@ our custom backend, `EmailLogBackEnd`. It does the following:
* Print a friendly message on console advertising `/emails` to make
this nice and discoverable.
You can also forward all the emails sent in the development environment
to an email address of your choice by clicking on **Forward emails to a
mail account** on the `/emails` page. This feature can be used to test
how emails get rendered by different email clients. Before enabling
this, you first have to configure the following SMTP settings.
* The hostname `EMAIL_HOST` in `zproject/dev_settings.py`
* The username `EMAIL_HOST_USER` in `zproject/dev_settings.py`.
* The password `email_password` in `zproject/dev-secrets.conf`.
See [this](../production/email.html#free-outgoing-email-services)
section for instructions on obtaining SMTP details.
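For example, with placeholder values (not working credentials; adjust
for your SMTP provider), those settings might look like:

```python
# zproject/dev_settings.py -- placeholder values for illustration
EMAIL_HOST = "smtp.example.com"
EMAIL_HOST_USER = "username@example.com"

# The password lives in zproject/dev-secrets.conf; an illustrative entry:
# [secrets]
# email_password = your-smtp-password
```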
**Note: The base_image_uri of the images in forwarded emails will be
replaced with `https://chat.zulip.org/static/images/emails` in order for
email clients to render the images. See `zproject/email_backends.py`
for more details.**
While running the backend test suite, we use
`django.core.mail.backends.locmem.EmailBackend` as the email
backend. The `locmem` backend stores messages in a special attribute

docs/emoji.md Normal file

@@ -0,0 +1,92 @@
# Emoji
Emoji seem like a simple idea, but there's actually a ton of
complexity that goes into an effective emoji implementation. This
document discusses a number of these issues.
Currently, Zulip uses the Noto (Android) emoji set, but we are close
to being able to support the user choosing which emoji set they want
to use.
## Emoji codes
The Unicode standard has various ranges of characters set aside for
emoji. So you can put emoji in your terminal using actual unicode
characters like 😀 and 👍. If you paste those into Zulip, Zulip will
render them as the corresponding emoji image.
However, the Unicode committee did not standardize on a set of
human-readable names for emoji. So, for example, when using the
popular `:` based style for entering emoji from the keyboard, we have
to decide whether to use `:angry:` or `:angry_face:` to represent an
angry face. Different products use different approaches, but for
purposes like emoji pickers or autocomplete, you definitely want to
pick exactly one of these names, since otherwise users will always be
seeing duplicates of a given emoji next to each other.
Picking which emoji name to use is surprisingly complicated! Zulip
has a nice library, `tools/setup/emoji/emoji_setup_utils.py`, which we
use to make these decisions systematically, with a relatively small
list of hand-coded exceptions.
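The kind of decision that library makes can be illustrated with a toy
sketch (hypothetical code points and a stand-in fallback rule, not the
actual `emoji_setup_utils.py` logic):

```python
# Hand-coded exceptions override the systematic rule, as described above.
# The code points and preferences here are illustrative only.
HAND_CODED = {
    "1f620": "angry",  # e.g. prefer :angry: over :angry_face:
}

def canonical_name(code_point, aliases):
    """Pick exactly one name per emoji so pickers show no duplicates."""
    if code_point in HAND_CODED:
        return HAND_CODED[code_point]
    # Systematic fallback: shortest alias wins (a stand-in rule).
    return min(aliases, key=len)

print(canonical_name("1f620", ["angry_face", "angry"]))  # angry
print(canonical_name("1f44d", ["thumbsup", "+1"]))       # +1
```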
### Custom emoji
Zulip supports custom user-uploaded emoji. We manage those by having
the name of the emoji be its "emoji code", and using an emoji_type
field to keep track of it. We are in the process of migrating Zulip
to refer to these emoji only by ID, which is a requirement for being
able to support deprecating old realm emoji in a sensible way.
## Tooling
We use the [iamcal emoji data package][iamcal] to provide sprite
sheets and individual images for our emoji, as well as a data set of
emoji categories, code points, names, etc. The sprite sheets are used
by the Zulip webapp to display emoji in messages, emoji reactions,
etc. However, we can't use the sprite sheets in some contexts, such
as missed-message and digest emails, which need to have self-contained
assets. For those, we use individual emoji files under
`static/generated/emoji`. That tree contains both files named after
the unicode representation of emoji (actual image files) and symlinks
pointing to those emoji.
We need to maintain these both for the names used in the iamcal emoji
data set and for those in our old emoji data set (`emoji_map.json`). Zulip
has a tool, `tools/setup/emoji/build_emoji`, that combines the
`emoji.json` file from iamcal with the old `emoji-map.json` data set
to construct the various symlink farms and output files described
below that support our emoji experience.
The `build_emoji` tool generates the set of files under
`static/generated/emoji` (or really, it generates the
`/srv/zulip-emoji-cache/<sha1>/emoji` tree, and
`static/generated/emoji` is a symlink to that tree; we do this in
order to cache old versions to make provisioning and production
deployments super fast in the common case that we haven't changed the
emoji tooling). See [our dependencies document](dependencies.html)
for more details on this strategy.
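The caching scheme described above can be sketched in a few lines
(simplified paths and a hypothetical fingerprint input, not the real
`build_emoji` logic):

```python
import hashlib
import os
import tempfile

# Build outputs go into a directory keyed by a hash of the inputs, and a
# stable symlink points at the current build, so unchanged inputs make
# rebuilds a no-op.
cache_root = tempfile.mkdtemp()  # stands in for /srv/zulip-emoji-cache
fingerprint = hashlib.sha1(b"contents of the emoji tooling").hexdigest()
build_dir = os.path.join(cache_root, fingerprint, "emoji")

if not os.path.isdir(build_dir):
    os.makedirs(build_dir)  # the expensive build runs only on a cache miss

# Point the stable name at the cached build; static/generated/emoji
# plays this role in a real checkout.
link = os.path.join(cache_root, "generated-emoji")
if os.path.islink(link):
    os.remove(link)
os.symlink(build_dir, link)

print(os.readlink(link) == build_dir)  # True
```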
The emoji tree generated by this process contains several important elements:
* `emoji_codes.js`: A set of mappings used by the Zulip frontend to
understand what unicode emoji exist and what their shortnames are,
used for autocomplete, emoji pickers, etc. This has been
deduplicated using the logic in
`tools/setup/emoji/emoji_setup_utils.py` to generally only have
`:angry:` and not also `:angry_face:`, since having both is ugly and
pointless for purposes like autocomplete and emoji pickers.
* `images/emoji/unicode/*.png`: A farm of emoji
* `images/emoji/*.png`: A farm of symlinks from emoji names to the
`images/emoji/unicode/` tree. This is used to serve individual emoji
images, as well as for the
[backend markdown processor](markdown.html) to know which emoji
names exist and what unicode emoji / images they map to. In this
tree, we currently include all of the emoji in `emoji-map.json`;
this means that if you send `:angry_face:`, it won't autocomplete,
but will still work (but not in previews).
* Some CSS and PNGs for the emoji spritesheets, used in Zulip for
emoji pickers, where we would otherwise need to download over 1000
individual emoji images (which would cause a browser performance
problem). We have multiple spritesheets: one for each emoji
provider that we support (Google, Twitter, EmojiOne, etc.).
[iamcal]: https://github.com/iamcal/emoji-data


@@ -4,7 +4,7 @@ Zulip's "events system" is the server-to-client push system that
powers our real-time sync. This document explains how it works; to
read an example of how a complete feature using this system works,
check out the
[new application feature tutorial](../tutorials/new-feature-tutorial.html).
[new application feature tutorial](new-feature-tutorial.html).
Any single-page web application like Zulip needs a story for how
changes made by one client are synced to other clients, though having
@@ -85,7 +85,7 @@ wide range of possible clients, and make it easy for developers.
Zulip's event delivery (real-time push) system is based on Tornado,
which is ideal for handling a large number of open requests. Details
on Tornado are available in the
[architecture overview](../overview/architecture-overview.html), but in short it
[architecture overview](architecture-overview.html), but in short it
is good at holding open a large number of connections for a long time.
The complete system is about 1500 lines of code in `zerver/tornado/`,
primarily `zerver/tornado/event_queue.py`.
@@ -125,7 +125,7 @@ parameters, updating `last_event_id` each time to acknowledge any
events it has received (see `call_on_each_event` in the
[Zulip Python API bindings][api-bindings-code] for a complete example
implementation). When handling each `GET /json/events` request, the
queue server can safely delete any events that have an event ID less
queue server can safely delete any events have have an event ID less
than or equal to the client's `last_event_id` (event IDs are just a
counter for the events a given queue has received.)
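The garbage-collection rule described above can be sketched as follows
(illustrative data shapes, not Zulip's actual `event_queue.py`
implementation):

```python
def prune_acknowledged(events, last_event_id):
    """Drop events with IDs at or below the client's last_event_id."""
    return [e for e in events if e["id"] > last_event_id]

queue = [
    {"id": 1, "type": "message"},
    {"id": 2, "type": "presence"},
    {"id": 3, "type": "message"},
]

# The client polls GET /json/events with last_event_id=2, so events
# 1 and 2 are acknowledged and can be deleted server-side.
queue = prune_acknowledged(queue, 2)
print([e["id"] for e in queue])  # [3]
```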
@@ -140,7 +140,7 @@ those events could be lost).
[api-bindings-code]: https://github.com/zulip/python-zulip-api/blob/master/zulip/zulip/__init__.py
The queue servers are a very high-traffic system, processing at a
minimum one request for every message delivered to every Zulip client.
minimum a request for every message delivered to every Zulip client.
Additionally, as a workaround for low-quality NAT servers that kill
HTTP connections that are open without activity for more than 60s, the
queue servers also send a heartbeat event to each queue at least once


@@ -1,7 +1,3 @@
```eval_rst
:orphan:
```
# Running expensive migrations early
Zulip 1.7 contains some significant database migrations that can take


@@ -3,9 +3,9 @@
This page documents additional information that may be useful when
developing new features for Zulip that require front-end changes,
especially those that involve adding new files. For a more general
overview, see the [new feature tutorial](../tutorials/new-feature-tutorial.html).
overview, see the [new feature tutorial](new-feature-tutorial.html).
Our [dependencies documentation](../subsystems/dependencies.html) has useful
Our [dependencies documentation](dependencies.html) has useful
relevant background as well.
## Primary build process


@@ -199,7 +199,7 @@ with Zulip could think of this as streams that are online.
* Bot - **Bot**
Not only is "bot" a short and easily memorable term, it is also widely used
Not only is "bot" a short and easily rememberable term, it is also widely used
in German technology magazines, forums, etc.
*"Bot" (Transifex, Heise, Die Zeit)*


@@ -0,0 +1,54 @@
# Git Cheat Sheet (Detailed)
See also
[fixing commits][fix-commit]
Commands:
- add
- `git add foo.py`: add `foo.py` to the staging area
- `git add foo.py bar.py`: add `foo.py` AND `bar.py` to the staging area
- checkout
- `git checkout -b new-branch-name`: create branch `new-branch-name` and switch/checkout to that new branch
- `git checkout master`: switch to your `master` branch
- `git checkout old-branch-name`: switch to an existing branch `old-branch-name`
- commit
- `git commit --amend`: changing the last commit message. Read more [here][fix-commit]
- config
- `git config --global core.editor nano`: set core editor to `nano` (you can set this to `vim` or others)
- `git config --global core.symlinks true`: allow symbolic links
- diff
- `git diff`: display the changes you have made to all files
- `git diff --cached`: display the changes you have made to staged files
- `git diff HEAD~2..`: display the combined changes from the two most recent commits
- fetch
- `git fetch origin`: fetch origin repository
- `git fetch upstream`: fetch upstream repository
- grep
- `git grep update_unread_counts -- '*.js'`: search all files (ending in `.js`) for `update_unread_counts`
- log
- `git log`: show commit logs
- pull
- **do not use for Zulip**
- push
- `git push origin +branch-name`: push your commits to your origin repository
- rebase
- `git rebase -i HEAD~3`: interactively rebase the three most recent commits on the current branch
- `git rebase -i master`: interactively rebase the current branch onto the `master` branch
- `git rebase upstream/master`: rebase the current branch onto `master` from the upstream repository
- reflog
- `git reflog | head -10`: show the 10 most recent reference log entries
- remote
- `git remote -v`: display your origin and upstream repositories
- reset
- `git reset HEAD~2`: reset two most recent commits
- rm
- `git rm oops.txt`: remove `oops.txt`
- show
- `git show HEAD`: display most recent commit
- `git show HEAD~~~`: display third most recent commit
- `git show master`: display most recent commit on `master`
- status
- `git status`: show the working tree status, unstaged and staged files
[fix-commit]: fixing-commits.html

docs/git-cheat-sheet.md Normal file

@@ -0,0 +1,52 @@
# Git Cheat Sheet
See also [fixing commits][fix-commit]
Commands:
- add
- `git add foo.py`
- checkout
- `git checkout -b new-branch-name`
- `git checkout master`
- `git checkout old-branch-name`
- commit
- `git commit --amend`
- config
- `git config --global core.editor nano`
- `git config --global core.symlinks true`
- diff
- `git diff`
- `git diff --cached`
- `git diff HEAD~2..`
- fetch
- `git fetch origin`
- `git fetch upstream`
- grep
- `git grep update_unread_counts -- '*.js'`
- log
- `git log`
- pull
- **do not use for Zulip**
- push
- `git push origin +branch-name`
- rebase
- `git rebase -i HEAD~3`
- `git rebase -i master`
- `git rebase upstream/master`
- reflog
- `git reflog | head -10`
- remote
- `git remote -v`
- reset
- `git reset HEAD~2`
- rm
- `git rm oops.txt`
- show
- `git show HEAD`
- `git show HEAD~~~`
- `git show master`
- status
- `git status`
[fix-commit]: fixing-commits.html

docs/git-guide.md Normal file

File diff suppressed because it is too large


@@ -1,116 +0,0 @@
# Git Cheat Sheet
See also [fixing commits][fix-commit]
## Common Commands
- add
- `git add foo.py`
- checkout
- `git checkout -b new-branch-name`
- `git checkout master`
- `git checkout old-branch-name`
- commit
- `git commit -m "topic: Commit message title."`
- `git commit --amend`: Modify the previous commit.
- config
- `git config --global core.editor nano`
- `git config --global core.symlinks true`
- diff
- `git diff`
- `git diff --cached`
- `git diff HEAD~2..`
- fetch
- `git fetch origin`
- `git fetch upstream`
- grep
- `git grep update_unread_counts -- '*.js'`
- log
- `git log`
- pull
- `git pull --rebase`: **Use this**. Zulip uses a [rebase oriented workflow][git-overview].
- `git pull` (with no options): Will either create a merge commit
(which you don't want) or do the same thing as `git pull --rebase`,
depending on [whether you've configured Git properly][git-clone-config]
- push
- `git push origin +branch-name`
- rebase
- `git rebase -i HEAD~3`
- `git rebase -i master`
- `git rebase upstream/master`
- reflog
- `git reflog | head -10`
- remote
- `git remote -v`
- reset
- `git reset HEAD~2`
- rm
- `git rm oops.txt`
- show
- `git show HEAD`
- `git show HEAD~~~`
- `git show master`
- status
- `git status`
## Detailed Cheat Sheet
- add
- `git add foo.py`: add `foo.py` to the staging area
- `git add foo.py bar.py`: add `foo.py` AND `bar.py` to the staging area
- `git add -u`: Adds all tracked files to the staging area.
- checkout
- `git checkout -b new-branch-name`: create branch `new-branch-name` and switch/checkout to that new branch
- `git checkout master`: switch to your `master` branch
- `git checkout old-branch-name`: switch to an existing branch `old-branch-name`
- commit
- `git commit -m "commit message"`: commit with an inline message (writing
a multiline commit message in your editor is recommended, however).
- `git commit`: Opens your default text editor to write a commit message.
- `git commit --amend`: changing the last commit message. Read more [here][fix-commit]
- config
- `git config --global core.editor nano`: set core editor to `nano` (you can set this to `vim` or others)
- `git config --global core.symlinks true`: allow symbolic links
- diff
- `git diff`: display the changes you have made to all files
- `git diff --cached`: display the changes you have made to staged files
- `git diff HEAD~2..`: display the combined changes from the two most recent commits
- fetch
- `git fetch origin`: fetch origin repository
- `git fetch upstream`: fetch upstream repository
- grep
- `git grep update_unread_counts -- '*.js'`: search all files (ending in `.js`) for `update_unread_counts`
- log
- `git log`: show commit logs
- `git log --oneline | head`: To quickly see the latest ten commits on a branch.
- pull
- `git pull --rebase`: rebase your changes on top of master.
- `git pull` (with no options): Will either create a merge commit
(which you don't want) or do the same thing as `git pull --rebase`,
depending on [whether you've configured Git properly][git-clone-config]
- push
- `git push origin branch-name`: push your commits to the origin repository *only if* there are no conflicts.
Use this when collaborating with others to prevent overwriting their work.
- `git push origin +branch-name`: force push your commits to your origin repository.
- rebase
- `git rebase -i HEAD~3`: interactively rebase the three most recent commits on the current branch
- `git rebase -i master`: interactively rebase the current branch onto the `master` branch
- `git rebase upstream/master`: rebase the current branch onto `master` from the upstream repository
- reflog
- `git reflog | head -10`: show the 10 most recent reference log entries
- remote
- `git remote -v`: display your origin and upstream repositories
- reset
- `git reset HEAD~2`: reset two most recent commits
- rm
- `git rm oops.txt`: remove `oops.txt`
- show
- `git show HEAD`: display most recent commit
- `git show HEAD~~~`: display the commit three before the most recent (equivalent to `HEAD~3`)
- `git show master`: display most recent commit on `master`
- status
- `git status`: show the working tree status, unstaged and staged files
[fix-commit]: fixing-commits.html
[git-clone-config]: cloning.html#step-1b-clone-to-your-machine
[git-overview]: overview.html

# Get Zulip code
Zulip uses a **forked-repo** and **[rebase][gitbook-rebase]-oriented
workflow**. This means that all contributors create a fork of the [Zulip
repository][github-zulip] they want to contribute to and then submit pull
requests to the upstream repository to have their contributions reviewed and
accepted. We also recommend you work on feature branches.
## Step 1a: Create your fork
You'll only need to do the following steps the first time you set up a machine
for contributing to a given Zulip project. You'll need to repeat the steps for
any additional Zulip projects ([list][github-zulip]) that you work on.
The first thing you'll want to do to contribute to Zulip is fork ([see
how][github-help-fork]) the appropriate [Zulip repository][github-zulip]. For
the main server app, this is [zulip/zulip][github-zulip-zulip].
## Step 1b: Clone to your machine
Next, clone your fork to your local machine:
```
$ git clone --config pull.rebase git@github.com:christi3k/zulip.git
Cloning into 'zulip'
remote: Counting objects: 86768, done.
remote: Compressing objects: 100% (15/15), done.
remote: Total 86768 (delta 5), reused 1 (delta 1), pack-reused 86752
Receiving objects: 100% (86768/86768), 112.96 MiB | 523.00 KiB/s, done.
Resolving deltas: 100% (61106/61106), done.
Checking connectivity... done.
```
(The `--config pull.rebase` option configures Git so that `git pull`
will behave like `git pull --rebase` by default. Using `git pull
--rebase` to update your changes to resolve merge conflicts is
expected by essentially all open source projects, including Zulip.
You can also set that option after cloning using `git config --add
pull.rebase true`, or just be careful to always run `git pull
--rebase`, never `git pull`).
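If you're ever unsure whether the setting took effect, you can check it directly (a quick sanity check, not a required step):

```shell
# Print the current value of pull.rebase for this clone; this prints
# "true" when the option took effect, and nothing when it is unset.
git config --get pull.rebase
```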
Note: If you receive an error while cloning, you may not have [added your ssh
key to GitHub][github-help-add-ssh-key].
Once the repository is cloned, we recommend running
[setup-git-repo][zulip-rtd-tools-setup] to install Zulip's pre-commit
hook which runs the Zulip linters on the changed files when you
commit.
## Step 1c: Connect your fork to Zulip upstream
Next you'll want to [configure an upstream remote
repository][github-help-conf-remote] for your fork of Zulip. This will allow
you to [sync changes][github-help-sync-fork] from the main project back into
your fork.
First, show the currently configured remote repository:
```
$ git remote -v
origin git@github.com:YOUR_USERNAME/zulip.git (fetch)
origin git@github.com:YOUR_USERNAME/zulip.git (push)
```
Note: If you've cloned the repository using a graphical client, you may already
have the upstream remote repository configured. For example, when you clone
[zulip/zulip][github-zulip-zulip] with the GitHub desktop client it configures
the remote repository `zulip` and you see the following output from `git remote
-v`:
```
origin git@github.com:YOUR_USERNAME/zulip.git (fetch)
origin git@github.com:YOUR_USERNAME/zulip.git (push)
zulip https://github.com/zulip/zulip.git (fetch)
zulip https://github.com/zulip/zulip.git (push)
```
If your client hasn't automatically configured a remote for zulip/zulip, you'll
need to add one with:
```
$ git remote add -f upstream https://github.com/zulip/zulip.git
```
Finally, confirm that the new remote repository, upstream, has been configured:
```
$ git remote -v
origin git@github.com:YOUR_USERNAME/zulip.git (fetch)
origin git@github.com:YOUR_USERNAME/zulip.git (push)
upstream https://github.com/zulip/zulip.git (fetch)
upstream https://github.com/zulip/zulip.git (push)
```
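With both remotes in place, a typical way to bring your local `master` up to date looks like this (a sketch; it assumes the `upstream` remote configured above and no uncommitted local changes):

```shell
# Download new commits from the upstream repository
git fetch upstream
# Replay your local master on top of the upstream master
git checkout master
git rebase upstream/master
# Update your fork on GitHub
git push origin master
```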
## Step 2: Set up the Zulip development environment
If you haven't already, now is a good time to install the Zulip development environment
([overview][zulip-rtd-dev-overview]). If you're new to working on Zulip or open
source projects in general, we recommend following our [detailed guide for
first-time contributors][zulip-rtd-dev-first-time].
## Step 3: Configure Travis CI (continuous integration)
This step is optional, but recommended.
The Zulip Server project is configured to use [Travis CI][travis-ci]
to test and create builds upon each new commit and pull
request. Travis CI is free for open source projects and it's easy to
configure for your own fork of Zulip. After doing so, Travis CI will
run tests for new refs you push to GitHub and email you the outcome
(you can also view the results in the web interface).
Running Travis CI against your fork can help save both you and the
Zulip maintainers time by making it easy to test a change fully before
submitting a pull request. We generally recommend a workflow where, as
you make changes, you use a fast edit-refresh cycle running individual
tests locally until your changes work. But then once you've gotten
the tests you'd expect to be relevant to your changes working, push a
branch to Travis CI to run the full test suite before you create a
pull request. While you wait for Travis CI to run, you can start
working on your next task. When the tests finish, you can create a
pull request that you already know passes the tests.
First, sign in to [Travis CI][travis-ci] with your GitHub account and authorize
Travis CI to access your GitHub account and repositories. Once you've done
this, Travis CI will fetch your repository information and display it on your
[profile page][travis-ci-profile]. From there you can enable integration with
Zulip. ([See screencast](../_static/zulip-travisci.gif).)
[gitbook-rebase]: https://git-scm.com/book/en/v2/Git-Branching-Rebasing
[github-help-add-ssh-key]: https://help.github.com/articles/adding-a-new-ssh-key-to-your-github-account/
[github-help-conf-remote]: https://help.github.com/articles/configuring-a-remote-for-a-fork/
[github-help-fork]: https://help.github.com/articles/fork-a-repo/
[github-help-sync-fork]: https://help.github.com/articles/syncing-a-fork/
[github-zulip]: https://github.com/zulip/
[github-zulip-zulip]: https://github.com/zulip/zulip/
[travis-ci]: https://travis-ci.org/
[travis-ci-profile]: https://travis-ci.org/profile
[zulip-rtd-dev-first-time]: ../development/setup-vagrant.html
[zulip-rtd-dev-overview]: ../development/overview.html
[zulip-rtd-tools-setup]: ../git/zulip-tools.html#set-up-git-repo-script

# Collaborate
## Fetch another contributor's branch
What happens when you would like to collaborate with another contributor and
they have work-in-progress on their own fork of Zulip? No problem! Just add
their fork as a remote and pull their changes.
```
$ git remote add <username> https://github.com/<username>/zulip.git
$ git fetch <username>
```
Now you can check out their branch just like you would any other. You can name
the branch anything you want, but using both the username and branch name will
help you keep things organized.
```
$ git checkout -b <username>/<branchname> <username>/<branchname>
```
You can choose to rename the branch if you prefer:
```
git checkout -b <custombranchname> <username>/<branchname>
```
## Checkout a pull request locally
Just as you can check out any user's branch locally, you can also check out any
pull request locally. GitHub provides a special syntax
([details][github-help-co-pr-locally]) for this since pull requests are
specific to GitHub rather than Git.
First, fetch and create a branch for the pull request, replacing *ID* and
*BRANCHNAME* with the ID of the pull request and your desired branch name:
```
$ git fetch upstream pull/ID/head:BRANCHNAME
```
Now switch to the branch:
```
$ git checkout BRANCHNAME
```
Now you work on this branch as you would any other.
Note: you can use the scripts provided in the tools/ directory to fetch pull
requests. You can read more about what they do [here][tools-PR].
```
tools/fetch-rebase-pull-request <PR-number>
tools/fetch-pull-request <PR-number>
```
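If you find yourself checking out pull requests often, you can also add a fetch refspec so that every pull request head is fetched automatically; this is a common convenience trick (the refspec below and PR number 1234 are illustrative, not part of the standard setup):

```shell
# Fetch all pull request heads as upstream/pr/<ID> on every fetch
git config --add remote.upstream.fetch '+refs/pull/*/head:refs/remotes/upstream/pr/*'
git fetch upstream
# Then check out any pull request by number, e.g. a hypothetical PR 1234
git checkout pr/1234
```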
[github-help-co-pr-locally]: https://help.github.com/articles/checking-out-pull-requests-locally/
[tools-PR]: ../git/zulip-tools.html#fetch-a-pull-request-and-rebase

#########
Git Guide
#########
.. toctree::
:maxdepth: 3
Quick Start <overview>
Set up Git <setup>
How Git is different <the-git-difference>
Important Git terms <terminology>
Get Zulip code <cloning>
Working copies <working-copies>
Using Git as you work <using>
Pull Requests <pull-requests>
Collaborate <collaborate>
Fixing commits <fixing-commits>
Reviewing changes <reviewing>
Get and stay out of trouble <troubleshooting>
Zulip-specific tools <zulip-tools>
Git Cheat Sheet <cheat-sheet>

# Quick start: How Zulip uses Git and GitHub
This quick start provides a brief overview of how Zulip uses Git and GitHub.
Those who are familiar with Git and GitHub should be able to start contributing
with these details in mind:
- We use **GitHub for source control and code review.** To contribute, fork
[zulip/zulip][github-zulip-zulip] (or the appropriate
[repository][github-zulip], if you are working on something else besides
Zulip server) to your own account and then create feature/issue branches.
When you're ready to get feedback, submit a work-in-progress (WIP) pull
request. *We encourage you to submit WIP pull requests early and often.*
- We use a **[rebase][gitbook-rebase]-oriented workflow.** We do not use merge
commits. This means you should use `git fetch` followed by `git rebase`
rather than `git pull` (or you can use `git pull --rebase`). Also, to prevent
pull requests from becoming out of date with the main line of development,
you should rebase your feature branch prior to submitting a pull request, and
as needed thereafter. If you're unfamiliar with how to rebase a pull request,
[read this excellent guide][github-rebase-pr].
We use this strategy to avoid the extra commits that appear
when another branch is merged in and that clutter the commit history (this
approach is also popular with other large projects, such as Django). This makes
Zulip's commit history more readable, but a side effect is that many
pull requests we merge will be reported by GitHub's UI as *closed*
instead of *merged*, since GitHub has poor support for
rebase-oriented workflows.
- We have a **[code style guide][zulip-rtd-code-style]**, a **[commit message
guide][zulip-rtd-commit-messages]**, and strive for each commit to be *a
minimal coherent idea* (see **[commit
discipline][zulip-rtd-commit-discipline]** for details).
- We provide **many tools to help you submit quality code.** These include
[linters][zulip-rtd-lint-tools], [tests][zulip-rtd-testing], continuous
integration with [Travis CI][travis-ci], and [mypy][zulip-rtd-mypy].
- We use [zulipbot][zulip-rtd-zulipbot-usage] to manage our issues and
pull requests to create a better GitHub workflow for contributors.
- We provide some handy **[Zulip-specific Git scripts][zulip-rtd-zulip-tools]**
for developers to easily do tasks like fetching and rebasing a pull
request, cleaning unimportant branches, etc. These reduce the common
tasks of testing other contributors' pull requests to single commands.
Finally, install the [Zulip developer environment][zulip-rtd-dev-overview], and then
[configure your fork for use with Travis CI][zulip-git-guide-travisci].
***
The following sections will help you be awesome with Zulip and Git/GitHub in a
rebase-based workflow. Read through them if you're new to Git, to a rebase-based
Git workflow, or if you'd like a Git refresher.
[gitbook-rebase]: https://git-scm.com/book/en/v2/Git-Branching-Rebasing
[github-rebase-pr]: https://github.com/edx/edx-platform/wiki/How-to-Rebase-a-Pull-Request
[github-zulip]: https://github.com/zulip/
[github-zulip-zulip]: https://github.com/zulip/zulip/
[travis-ci]: https://travis-ci.org/
[zulip-git-guide-travisci]: ../git/cloning.html#step-3-configure-travis-ci-continuous-integration
[zulip-rtd-code-style]: ../contributing/code-style.html
[zulip-rtd-commit-discipline]: ../contributing/version-control.html#commit-discipline
[zulip-rtd-commit-messages]: ../contributing/version-control.html#commit-messages
[zulip-rtd-dev-overview]: ../development/overview.html
[zulip-rtd-lint-tools]: ../contributing/code-style.html#lint-tools
[zulip-rtd-mypy]: ../contributing/mypy.html
[zulip-rtd-testing]: ../testing/testing.html
[zulip-rtd-zulip-tools]: ../git/zulip-tools.html
[zulip-rtd-zulipbot-usage]: ../contributing/zulipbot-usage.html

# Create a pull request
When you're ready for feedback, submit a pull request. Pull requests
are a feature specific to GitHub. They provide a simple, web-based way
to submit your work (often called "patches") to a project. It's called
a *pull request* because you're asking the project to *pull changes*
from your fork.
If you're unfamiliar with how to create a pull request, you can check
out GitHub's documentation on
[creating a pull request from a fork][github-help-create-pr-fork]. You
might also find GitHub's article
[about pull requests][github-help-about-pr] helpful. That all said,
the tutorial below will walk you through the process.
## Work in progress pull requests
In the Zulip project, we encourage submitting work-in-progress pull
requests early and often. This allows you to share your code to make
it easier to get feedback and help with your changes. Prefix the
titles of work-in-progress pull requests with **[WIP]**, which in our
project means that you don't think your pull request is ready to be
merged (e.g. it might not work or pass tests). This sets expectations
correctly for any feedback from other developers, and prevents your
work from being merged before you're confident in it.
## Create a pull request
### Step 1: Update your branch with git rebase
The best way to update your branch is with `git fetch` and `git rebase`. Do not
use `git pull` or `git merge` as this will create merge commits. See [keep your
fork up to date][keep-up-to-date] for details.
Here's an example (you would replace *issue-123* with the name of your feature branch):
```
$ git checkout issue-123
Switched to branch 'issue-123'
$ git fetch upstream
remote: Counting objects: 69, done.
remote: Compressing objects: 100% (23/23), done.
remote: Total 69 (delta 49), reused 39 (delta 39), pack-reused 7
Unpacking objects: 100% (69/69), done.
From https://github.com/zulip/zulip
69fa600..43e21f6 master -> upstream/master
$ git rebase upstream/master
First, rewinding head to replay your work on top of it...
Applying: troubleshooting tip about provisioning
```
### Step 2: Push your updated branch to your remote fork
Once you've updated your local feature branch, push the changes to GitHub:
```
$ git push origin issue-123
Counting objects: 6, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (4/4), done.
Writing objects: 100% (6/6), 658 bytes | 0 bytes/s, done.
Total 6 (delta 3), reused 0 (delta 0)
remote: Resolving deltas: 100% (3/3), completed with 1 local objects.
To git@github.com:christi3k/zulip.git
+ 2d49e2d...bfb2433 issue-123 -> issue-123
```
If your push is rejected with error **failed to push some refs** then you need
to prefix the name of your branch with a `+`:
```
$ git push origin +issue-123
Counting objects: 6, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (4/4), done.
Writing objects: 100% (6/6), 658 bytes | 0 bytes/s, done.
Total 6 (delta 3), reused 0 (delta 0)
remote: Resolving deltas: 100% (3/3), completed with 1 local objects.
To git@github.com:christi3k/zulip.git
+ 2d49e2d...bfb2433 issue-123 -> issue-123 (forced update)
```
This is perfectly okay to do on your own feature branches, especially if you're
the only one making changes to the branch. If others are working on the branch
with you, however, they might run into complications when they retrieve your
changes, because anyone who has based work on the branch you rebased will have
to do a complicated rebase.
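If you do force-push a shared branch, one simple way for a collaborator to realign their copy with yours is the following (a sketch using the *issue-123* branch from the example above; note that this discards any commits of their own on that branch):

```shell
# Download the rewritten branch
git fetch origin
# Point the local branch at the new history (destructive: local-only
# commits on this branch are discarded)
git checkout issue-123
git reset --hard origin/issue-123
```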
### Step 3: Open the pull request
If you've never created a pull request or need a refresher, take a look at
GitHub's article [creating a pull request from a
fork][github-help-create-pr-fork]. We'll briefly review the process here.
The first step in creating a pull request is to use your web browser to
navigate to your fork of Zulip. Sign in to GitHub if you haven't already.
Next, navigate to the branch you've been working on. Do this by clicking on the
**Branch** button and selecting the relevant branch. Finally, click the **New
pull request** button.
Alternatively, if you've recently pushed to your fork, you will see a green
**Compare & pull request** button.
You'll see the *Open a pull request* page:
![images-create-pr]
Provide a **title** and first comment for your pull request. Remember to prefix
your pull request title with [WIP] if it is a [work-in-progress][wip-prs].
If your pull request has an effect on the visuals of a component, you might want
to include a screenshot of this change or a GIF of the interaction in your first
comment. This will allow reviewers to comment on your changes without having to
check out your branch; you can find a list of tools you can use for this
[here][screenshots-gifs].
When ready, click the green **Create pull request** button to submit the pull request.
Note: **Pull request titles are different from commit messages.** Commit
messages can be edited with `git commit --amend`, `git rebase -i`, etc., while
the title of a pull request can only be edited via GitHub.
## Update a pull request
As you make progress on your feature or bugfix, your pull request, once
submitted, will be updated each time you [push commits][push-commits] to
your remote branch. This means you can keep your pull request open as long as
you need, rather than closing and opening new ones for the same feature or
bugfix.
It's a good idea to keep your pull request mergeable with Zulip upstream by
frequently fetching, rebasing, and pushing changes. See [keep your fork up to
date][keep-up-to-date] for details. You might also find this excellent
article [How to Rebase a Pull Request][edx-howto-rebase-pr] helpful.
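Putting those pieces together, a typical update cycle for an open pull request looks something like this (a sketch; *issue-123* stands in for your branch name):

```shell
# Bring in the latest upstream changes
git fetch upstream
# Replay the feature branch on top of the upstream master
git checkout issue-123
git rebase upstream/master
# Force-push the rebased branch to update the pull request
git push origin +issue-123
```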
And, as you address review comments others have made, we recommend posting a
follow-up comment in which you: a) ask for any clarifications you need, b)
explain to the reviewer how you solved any problems they mentioned, and c) ask
for another review.
[edx-howto-rebase-pr]: https://github.com/edx/edx-platform/wiki/How-to-Rebase-a-Pull-Request
[github-help-about-pr]: https://help.github.com/articles/about-pull-requests/
[github-help-create-pr-fork]: https://help.github.com/articles/creating-a-pull-request-from-a-fork/
[images-create-pr]: ../images/zulip-open-pr.png
[keep-up-to-date]: ../git/using.html#keep-your-fork-up-to-date
[push-commits]: ../git/using.html#push-your-commits-to-github
[screenshots-gifs]: ../tutorials/screenshot-and-gif-software.html
[wip-prs]: #work-in-progress-pull-requests
